squall/flash-attention
663 Commits · 1 Branch · 0 Tags · 7.9 MiB · 5018ac6ac5
Commit Graph: 2 Commits

Author    SHA1        Message                                                             Date
Tri Dao   dfe29f5e2b  [Gen] Don't use ft_attention, use flash_attn_with_kvcache instead  2023-09-18 15:29:06 -07:00
Kevin Hu  07005806ff  Add BigCode converters (#532)                                       2023-09-10 17:24:50 -07:00
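
Commit dfe29f5e2b switches the generation path from the ft_attention kernel to flash_attn_with_kvcache. As a rough illustration of the decoding-step call that function supports, here is a minimal sketch assuming flash-attn >= 2.2 on a CUDA device with fp16 tensors; the batch, head, and length sizes are made up for the example:

```python
# Minimal single-step decoding sketch, assuming flash-attn >= 2.2 on a CUDA
# GPU with fp16 tensors; batch/head/length sizes are illustrative only.
import torch
from flash_attn import flash_attn_with_kvcache

batch, nheads, headdim, max_seqlen = 2, 8, 64, 256
device, dtype = "cuda", torch.float16

# Pre-allocated KV cache, updated in place by the kernel.
k_cache = torch.zeros(batch, max_seqlen, nheads, headdim, device=device, dtype=dtype)
v_cache = torch.zeros_like(k_cache)

# One new token per sequence for this generation step.
q = torch.randn(batch, 1, nheads, headdim, device=device, dtype=dtype)
k_new = torch.randn_like(q)
v_new = torch.randn_like(q)

# Per-sequence count of tokens already in the cache (int32 required).
cache_seqlens = torch.full((batch,), 10, dtype=torch.int32, device=device)

# Writes k_new/v_new into the cache at position cache_seqlens and attends
# over the cached prefix plus the new token, all in one fused call.
out = flash_attn_with_kvcache(
    q, k_cache, v_cache, k=k_new, v=v_new,
    cache_seqlens=cache_seqlens, causal=True,
)
print(out.shape)  # torch.Size([2, 1, 8, 64])
```

Because the kernel both appends the new keys/values to the cache and computes attention over the cached prefix in a single fused call, a separate FasterTransformer-style decoding kernel is no longer needed for generation, which matches the commit message.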
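
Commit 07005806ff adds converters for BigCode-style checkpoints (e.g. StarCoder). A hedged sketch of how such a converter would typically be used, following the naming pattern of this repo's other model converters (remap_state_dict_hf_*, *_config_to_gpt2_config); the flash_attn.models.bigcode module and function names below are assumptions based on that pattern, not a confirmed API:

```python
# Hedged sketch: loading a BigCode/StarCoder checkpoint via the converters.
# The flash_attn.models.bigcode names below are assumed from the repo's
# naming pattern for other models and may differ in the actual commit.
import torch
from transformers import AutoConfig
from flash_attn.models.gpt import GPTLMHeadModel
from flash_attn.models.bigcode import bigcode_config_to_gpt2_config  # assumed name

model_name = "bigcode/starcoderbase-1b"  # illustrative checkpoint
config = bigcode_config_to_gpt2_config(AutoConfig.from_pretrained(model_name))

# from_pretrained remaps the Hugging Face state dict into this repo's GPT
# layout (the BigCode remapping step is what this commit adds).
model = GPTLMHeadModel.from_pretrained(
    model_name, config, device="cuda", dtype=torch.float16
)
```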