Commit 9865789
Parent(s): e6c24fd

Update flash-attention wheel URL in requirements.txt

Files changed: requirements.txt (+2 -1)

requirements.txt CHANGED
@@ -1,8 +1,9 @@
 huggingface_hub
+transformers
 torch
 torchvision
 qwen_vl_utils
 Pillow
 PyMuPDF
 accelerate
-
+https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.7-cp310-cp310-linux_x86_64.whl
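
Note: pip treats a bare URL line in requirements.txt as a direct requirement, so pip install -r requirements.txt downloads the prebuilt flash-attn wheel instead of compiling it from source. The wheel filename encodes its compatibility constraints: flash_attn 2.7.4.post1, CUDA 12.6, torch 2.7, CPython 3.10, linux x86_64. A minimal post-install sanity check, assuming the standard import names for these packages (a sketch, not part of the commit):

    import torch
    import flash_attn

    # The wheel targets torch 2.7 + CUDA 12.6 on CPython 3.10 (linux x86_64);
    # these values should match the running environment.
    print(torch.__version__)       # expected to start with 2.7
    print(torch.version.cuda)      # expected 12.6
    print(flash_attn.__version__)  # expected 2.7.4.post1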