squall/flash-attention
flash-attention / .github @ 0399432d68

Latest commit: 0399432d68 "[CI] Use CUDA 12.2.2 instead of 12.2.0" by Tri Dao, 2024-01-21 15:35:57 -08:00

..
workflows/    [CI] Use CUDA 12.2.2 instead of 12.2.0    2024-01-21 15:35:57 -08:00