Update requirements.txt
requirements.txt  CHANGED  (+1 -1)

@@ -1,4 +1,4 @@
-flash_attn
+https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.9.post1/flash_attn-2.5.9.post1+cu118torch1.12cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
 torch
 transformers
 gradio
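For context, this change replaces the bare flash_attn requirement with a direct link to a prebuilt wheel (per the filename tags: CUDA 11.8, CPython 3.10, Linux x86_64), so pip downloads a binary instead of compiling the extension from source. A minimal post-install sanity check, assuming the environment matches those wheel tags:

# Run after `pip install -r requirements.txt`.
# Confirms the prebuilt flash_attn wheel is importable and that a CUDA
# device is visible (the wheel only ships CUDA kernels).
from importlib.metadata import version

import torch

print("flash_attn:", version("flash_attn"))   # expected: 2.5.9.post1
print("torch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())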