squall / flash-attention
flash_attn / utils (at commit dfe29f5e2b)

Latest commit dfe29f5e2b by Tri Dao: [Gen] Don't use ft_attention, use flash_attn_with_kvcache instead (2023-09-18 15:29:06 -07:00)
File            Last commit message                                                 Date
__init__.py     Add __init__.py files to subdirectories for installation            2022-11-17 16:55:44 -08:00
benchmark.py    Run isort and black on python files                                 2023-08-18 14:22:11 -07:00
distributed.py  Run isort and black on python files                                 2023-08-18 14:22:11 -07:00
generation.py   [Gen] Don't use ft_attention, use flash_attn_with_kvcache instead   2023-09-18 15:29:06 -07:00
pretrained.py   [GPT] Fix loading weights from HF hub                               2023-08-21 22:56:02 -07:00