squall/flash-attention
flash-attention/tests/ops at commit 665b55e2e2
Latest commit 665b55e2e2 by Tri Dao: [LayerNorm] Implement parallel layer norm in Triton (2024-01-04 23:15:35 -08:00)
triton/                       [LayerNorm] Implement parallel layer norm in Triton   2024-01-04 23:15:35 -08:00
test_dropout_layer_norm.py    Run isort and black on test files                     2023-08-18 20:59:35 -07:00
test_fused_dense_parallel.py  Run isort and black on test files                     2023-08-18 20:59:35 -07:00
test_fused_dense.py           Run isort and black on test files                     2023-08-18 20:59:35 -07:00
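These files test the fused ops shipped with flash-attention (dropout + layer norm, fused dense layers, and the Triton layer-norm kernels under triton/). As a rough illustration of what a test like test_dropout_layer_norm.py verifies, here is a minimal sketch of a pure-PyTorch reference for the dropout + residual + layer-norm path that a fused kernel would be compared against. The function name, shapes, and tolerances are illustrative assumptions, not the repository's actual test code.

```python
import torch
import torch.nn.functional as F

def dropout_add_layer_norm_ref(x, residual, weight, bias, p, eps=1e-5):
    # Reference path composed from plain PyTorch ops:
    # dropout on the input, optional residual add, then
    # layer norm over the last (hidden) dimension.
    out = F.dropout(x, p=p, training=True)
    if residual is not None:
        out = out + residual
    return F.layer_norm(out, (x.shape[-1],), weight, bias, eps)

# Illustrative shapes: (batch, seqlen, hidden)
x = torch.randn(4, 16, 1024)
residual = torch.randn(4, 16, 1024)
weight = torch.ones(1024)
bias = torch.zeros(1024)

y = dropout_add_layer_norm_ref(x, residual, weight, bias, p=0.1)
print(y.shape)  # torch.Size([4, 16, 1024])
```

A fused-kernel test would typically run the same inputs through both the fused op and a reference like this (with dropout disabled or a fixed mask, so outputs are deterministic) and assert the results match within a numerical tolerance, e.g. via torch.testing.assert_close.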