flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.3/flash_attn-2.7.3+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
git+https://github.com/huggingface/accelerate.git
git+https://github.com/huggingface/peft.git
transformers-stream-generator
gradio_pdf==0.0.22
huggingface_hub
albumentations
beautifulsoup4
qwen-vl-utils
pyvips-binary
sentencepiece
opencv-python
docling-core
transformers
torch==2.6.0
python-docx
torchvision
matplotlib
tokenizers
pdf2image
num2words
reportlab
html2text
easydict
protobuf
markdown
requests
pymupdf
loguru
hf_xet
spaces
pyvips
pillow
addict
gradio
einops
httpx
click
oss2
fpdf
timm
av
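# Note: the flash_attn wheel pinned above targets CUDA 12, PyTorch 2.6, and
# CPython 3.10 (cp310) on linux_x86_64, matching the torch==2.6.0 pin; on
# other platforms, pick a matching wheel from the same release page or build
# flash-attn from source.
# Typical install: pip install -r requirements.txt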