torch>=2.0.0
transformers>=4.30.0
gradio>=3.50.0
Pillow>=9.0.0
# Prebuilt flash-attn wheel; per its filename it targets CUDA 12, torch 2.2,
# Python 3.10 (cp310), and Linux x86_64. Other platforms need a different wheel
# or a source build.
flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
numpy>=1.20.0
tqdm>=4.64.0