squall/flash-attention
tests/modules (at commit c60851a825)
Latest commit 8ee62efca3: Implement ParallelGatedMlp (#251), Haodong Lyu, 2023-07-26 12:14:15 -07:00
File                         Last commit                                                          Date
test_block_parallel.py       [FusedDense] Support relu, rename FusedDenseGeluDense -> FusedMLP    2023-01-17 18:12:27 -08:00
test_embedding_parallel.py   [TP] Implement TensorParallel without sequence parallel              2023-01-07 13:45:22 -08:00
test_mha_parallel.py         [TP] Implement TensorParallel without sequence parallel              2023-01-07 13:45:22 -08:00
test_mlp_parallel.py         Implement ParallelGatedMlp (#251)                                    2023-07-26 12:14:15 -07:00
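The files above cover the tensor-parallel ("TP") variants of the repository's building blocks (block, embedding, MHA, MLP), which are presumably checked against unsharded reference implementations. As a rough, self-contained illustration of the pattern such multi-GPU tests usually follow, here is a minimal sketch of a column-parallel matmul correctness check; the launch command, shapes, and sharding scheme are assumptions for illustration only and are not taken from the repository's actual test code.

```python
# Illustrative only: a generic column-parallel correctness check, not the
# flash-attention test code itself. Launch with e.g.
#   torchrun --nproc_per_node=2 this_file.py
import torch
import torch.distributed as dist

def run():
    # Each rank holds one shard of the weight; the sharded result is
    # gathered and compared against the unsharded reference.
    dist.init_process_group(backend="nccl")
    rank = dist.get_rank()
    world_size = dist.get_world_size()
    torch.cuda.set_device(rank)

    torch.manual_seed(0)  # same seed on every rank so inputs and weights match
    x = torch.randn(2, 16, device="cuda")
    w = torch.randn(16, 16, device="cuda")          # full reference weight
    w_shard = w.chunk(world_size, dim=1)[rank]       # this rank's output-dim slice

    out_ref = x @ w          # unsharded reference
    out_local = x @ w_shard  # this rank's slice of the output

    # Gather the per-rank slices and compare with the reference.
    gathered = [torch.empty_like(out_local) for _ in range(world_size)]
    dist.all_gather(gathered, out_local)
    out_parallel = torch.cat(gathered, dim=1)
    assert torch.allclose(out_ref, out_parallel, atol=1e-5)

    dist.destroy_process_group()

if __name__ == "__main__":
    run()
```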