squall / flash-attention
tests / ops

Latest commit 8c6609ae1a (Tri Dao, 2022-12-09 02:06:22 -08:00): [LayerNorm] Support all dimensions up to 6k (if divisible by 8)
test_dropout_layer_norm.py    [LayerNorm] Support all dimensions up to 6k (if divisible by 8)    2022-12-09 02:06:22 -08:00
test_fused_dense.py    Add fused_dense and dropout_add_layernorm CUDA extensions    2022-11-13 21:59:20 -08:00
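
test_dropout_layer_norm.py exercises the fused dropout + residual-add + LayerNorm kernel named in both commit messages; per the latest commit, the hidden dimension may be anything up to 6k as long as it is divisible by 8. A minimal usage sketch, assuming the CUDA extension is built, a GPU is available, and that dropout_add_layer_norm takes (input, residual, weight, bias, dropout_p, epsilon) positionally; the test file itself is the authoritative reference for the signature:

```python
# Sketch of the fused op exercised by test_dropout_layer_norm.py: a single
# CUDA kernel computing LayerNorm(dropout(x0) + residual). The positional
# argument order below is an assumption; see the test file for the exact API.
import torch
import torch.nn.functional as F

from flash_attn.ops.layer_norm import dropout_add_layer_norm

batch, seqlen, hidden = 2, 512, 1024  # hidden must be divisible by 8, up to 6k
x0 = torch.randn(batch, seqlen, hidden, device="cuda", dtype=torch.float16)
residual = torch.randn_like(x0)
weight = torch.ones(hidden, device="cuda", dtype=torch.float16)
bias = torch.zeros(hidden, device="cuda", dtype=torch.float16)

# dropout_p=0.0 makes the fused result deterministic and directly comparable.
out = dropout_add_layer_norm(x0, residual, weight, bias, 0.0, 1e-5)

# Unfused PyTorch reference for the same computation.
ref = F.layer_norm(x0 + residual, (hidden,), weight, bias, 1e-5)
torch.testing.assert_close(out, ref, rtol=1e-3, atol=1e-3)
```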
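
test_fused_dense.py covers the fused dense (GEMM + bias) extension added in the 2022-11-13 commit. A hedged sketch, assuming FusedDense in flash_attn.ops.fused_dense is a drop-in replacement for nn.Linear with matching parameter names; the class name and import path follow the repo layout but should be treated as assumptions:

```python
# Sketch of the fused dense layer: a drop-in nn.Linear replacement that fuses
# the matmul with the bias add. Class name and import path are assumptions
# based on the repo layout (flash_attn/ops/fused_dense.py); requires the
# compiled CUDA extension and a GPU.
import torch
import torch.nn as nn

from flash_attn.ops.fused_dense import FusedDense

in_features, out_features = 1024, 4096
x = torch.randn(8, 512, in_features, device="cuda", dtype=torch.float16)

fused = FusedDense(in_features, out_features).to(device="cuda", dtype=torch.float16)
ref = nn.Linear(in_features, out_features).to(device="cuda", dtype=torch.float16)
ref.load_state_dict(fused.state_dict())  # copy weights so outputs are comparable

# Loose tolerances: fp16 GEMMs may differ slightly between kernels.
torch.testing.assert_close(fused(x), ref(x), rtol=1e-2, atol=1e-2)
```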