Releases · Dao-AILab/flash-attention
v2.7.0.post2: [CI] PyTorch 2.5.1 does not support Python 3.8
v2.7.0.post1: [CI] Switch back to CUDA 12.4
v2.7.0: Bump to v2.7.0
v2.6.3: Bump to v2.6.3
v2.6.2: Bump to v2.6.2
v2.6.1: Bump to v2.6.1
v2.6.0.post1: [CI] Compile with PyTorch 2.4.0.dev20240514
v2.6.0: Bump to v2.6.0
v2.5.9.post1: Limit to MAX_JOBS=1 with CUDA 12.2 (see the build sketch after this list)
v2.5.9: Bump to v2.5.9
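The MAX_JOBS note on v2.5.9.post1 concerns building flash-attn from source, which compiles large CUDA kernels and can exhaust RAM when many compiler jobs run in parallel. Below is a minimal sketch of such a constrained build; it assumes pip and a CUDA 12.2 toolchain are already installed, and the MAX_JOBS value of 1 simply mirrors the release note (a higher value may be fine on machines with more memory).

```python
# Sketch only: install flash-attn from source with a single parallel compile job.
# MAX_JOBS is read by PyTorch's C++/CUDA extension builder and caps how many
# compiler processes run at once, keeping peak memory usage in check.
import os
import subprocess
import sys

os.environ["MAX_JOBS"] = "1"  # one compile job, as the v2.5.9.post1 note suggests
subprocess.run(
    [sys.executable, "-m", "pip", "install",
     "flash-attn", "--no-build-isolation"],
    check=True,
)
```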