squall/flash-attention: tests/ (at ada4710d70)
Latest commit: a81900d4c1 "[ViT] Minor fix so it runs" (Tri Dao, 2023-08-17 17:25:34 -07:00)
Name                 Last commit                                                    Date
layers/              [Rotary] Implement GPT-J style (interleaved) rotary            2023-03-14 14:35:53 -07:00
losses/              Tweak CrossEntropyLoss to take process_group in init           2022-12-27 10:47:43 -08:00
models/              [ViT] Minor fix so it runs                                     2023-08-17 17:25:34 -07:00
modules/             Implement ParallelGatedMlp (#251)                              2023-07-26 12:14:15 -07:00
ops/                 [LayerNorm] Add test for randomness                            2023-07-23 12:31:55 -10:00
test_flash_attn.py   Fix Bwd NaN for varlen when seqlen_q >> seqlen_k and causal    2023-08-16 15:12:36 -07:00
test_rotary.py       Add MLP, MHA, Block, Embedding modules                         2022-11-13 22:06:44 -08:00
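
The two top-level files are pytest suites; test_flash_attn.py, for instance, checks the fused kernel output against a plain PyTorch reference. Below is a minimal sketch of that kind of check, not the repository's own test code: it assumes the flash_attn package is installed, a CUDA GPU is available, and uses the public flash_attn_func entry point; the tensor shapes and tolerance are illustrative.

```python
import math
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 128, 4, 64  # illustrative sizes
q, k, v = [torch.randn(batch, seqlen, nheads, headdim,
                       device="cuda", dtype=torch.float16)
           for _ in range(3)]

# Run the FlashAttention kernel with causal masking and no dropout.
out = flash_attn_func(q, k, v, dropout_p=0.0, causal=True)

# Reference: standard softmax attention computed in fp32 with an
# explicit upper-triangular causal mask.
scores = torch.einsum("bshd,bthd->bhst", q.float(), k.float()) / math.sqrt(headdim)
causal_mask = torch.triu(
    torch.ones(seqlen, seqlen, dtype=torch.bool, device="cuda"), diagonal=1)
scores.masked_fill_(causal_mask, float("-inf"))
ref = torch.einsum("bhst,bthd->bshd", torch.softmax(scores, dim=-1), v.float())

# Compare fp16 kernel output to the fp32 reference with a loose,
# illustrative tolerance (the real suite uses a relative criterion).
assert (out.float() - ref).abs().max().item() < 2e-2
```

The suites themselves can be run directly with pytest, e.g. `pytest tests/test_flash_attn.py`.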