squall/flash-attention
.github/workflows at commit 5ab9b3667b
Latest commit d4a7c8ffbb by Tri Dao: [CI] Only compile for CUDA 11.8 & 12.2, MAX_JOBS=2, add torch-nightly (2023-11-27 16:21:28 -08:00)
publish.yml — last modified by commit d4a7c8ffbb (2023-11-27 16:21:28 -08:00)
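The contents of publish.yml are not shown on this page. The commit message describes a wheel-building workflow limited to CUDA 11.8 and 12.2, built with MAX_JOBS=2, plus a torch-nightly entry. What follows is a minimal hypothetical sketch of such a GitHub Actions matrix, assuming a Linux runner, illustrative torch version strings, and a setuptools-based build; it is not the repository's actual configuration.

# Hypothetical sketch (not the actual publish.yml): a build matrix restricted to
# CUDA 11.8 and 12.2, with MAX_JOBS=2 capping parallel nvcc jobs, and a
# torch-nightly entry alongside a stable release. Versions below are assumed.
name: Build wheels

on:
  push:
    tags:
      - 'v*'

jobs:
  build_wheels:
    runs-on: ubuntu-20.04          # assumed runner image
    strategy:
      matrix:
        cuda-version: ['11.8.0', '12.2.2']              # only these two CUDA toolkits
        torch-version: ['2.1.1', '2.2.0.dev20231127']   # stable + nightly (illustrative)
    steps:
      - uses: actions/checkout@v4
      - name: Build wheel
        env:
          MAX_JOBS: 2              # limit parallel compile jobs to avoid exhausting CI memory
        run: |
          pip install torch==${{ matrix.torch-version }}
          python setup.py bdist_wheel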