squall / flash-attention
629 Commits · 1 Branch · 0 Tags · 7.9 MiB · Latest commit 3d41db3e2c
Commit Graph (53 Commits)
Author    SHA1        Message                                                     Date
Tri Dao   5badfb7848  Implement attention kernel that splits the batch into two   2022-10-13 20:49:02 -07:00
Tri Dao   0c01568daf  Only run backward test for d=128 on A100                    2022-10-04 18:06:08 -07:00
Tri Dao   2ed471ecc4  Add tests for numerical error                               2022-07-22 17:54:09 -04:00