squall/flash-attention
flash-attention/tests/modules at commit 780e8eeabb
Latest commit: 93383bd55b "[TP] Implement TensorParallel without sequence parallel" by Tri Dao, 2023-01-07 13:45:22 -08:00
File                        Last commit message                                         Date
test_block_parallel.py      [TP] Implement TensorParallel without sequence parallel    2023-01-07 13:45:22 -08:00
test_embedding_parallel.py  [TP] Implement TensorParallel without sequence parallel    2023-01-07 13:45:22 -08:00
test_mha_parallel.py        [TP] Implement TensorParallel without sequence parallel    2023-01-07 13:45:22 -08:00