squall/flash-attention
flash_attn/models
Latest commit e68ebbe89a "Simplify FusedDense" by Tri Dao, 2022-12-22 21:25:31 -08:00
File           Last commit message                                          Last commit date
..
__init__.py    Add __init__.py files to subdirectories for installation    2022-11-17 16:55:44 -08:00
bert.py        Simplify FusedDense                                          2022-12-22 21:25:31 -08:00
gpt.py         Simplify FusedDense                                          2022-12-22 21:25:31 -08:00
vit.py         [ViT] Use dropout_add_ln for the 1st layer norm              2022-11-23 12:48:56 -08:00
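
For orientation, a minimal usage sketch for these model files follows. It assumes, as in the upstream flash-attention repository, that gpt.py exposes a GPTLMHeadModel constructed from a Hugging Face GPT2Config; the class name, constructor, and return value here are assumptions about this fork, not documented API.

    # Minimal usage sketch, not a documented API of this fork: assumes
    # flash_attn.models.gpt exposes GPTLMHeadModel built from a Hugging Face
    # GPT2Config, as in the upstream flash-attention repository.
    import torch
    from transformers import GPT2Config
    from flash_attn.models.gpt import GPTLMHeadModel  # assumed export

    config = GPT2Config(n_embd=768, n_head=12, n_layer=12)
    # FlashAttention kernels run on CUDA in fp16/bf16, so move the model there.
    model = GPTLMHeadModel(config).to(device="cuda", dtype=torch.float16)

    input_ids = torch.randint(0, config.vocab_size, (1, 128), device="cuda")
    out = model(input_ids)   # assumed to return an output carrying .logits
    print(out.logits.shape)  # expected: (1, 128, vocab_size)

The same pattern would presumably apply to bert.py and vit.py, which per the commit messages above share the simplified FusedDense layers and the fused dropout_add_ln layer norm.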