squall/flash-attention

tests/ directory listing at commit 16025d8cc9. Latest commit touching this directory: 299563626f "Fix test with alibi and cache_leftpad" (Tri Dao, 2024-07-23 02:04:15 -07:00).

| Name | Last commit | Date |
|------|-------------|------|
| layers/ | Run isort and black on test files | 2023-08-18 |
| losses/ | return z_loss (#768) | 2024-01-21 |
| models/ | Add test for BTLM init | 2023-12-25 |
| modules/ | Run isort and black on test files | 2023-08-18 |
| ops/ | [LayerNorm] Rename layernorm.py -> layer_norm.py | 2024-01-05 |
| pyproject.toml | Move pyproject.toml to flash-attn and tests dir to avoid PEP 517 | 2023-08-25 |
| test_flash_attn_ck.py | Support AMD ROCm on FlashAttention 2 (#1010) | 2024-07-22 |
| test_flash_attn.py | Fix test with alibi and cache_leftpad | 2024-07-23 |
| test_rotary.py | Fix spurious re-compilations of rotary_kernel (#911) | 2024-04-05 |
| test_util.py | Add var-seq-len to FA3 fp16 / bf16 fwd (#1072) | 2024-07-22 |
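
test_flash_attn.py is the main correctness suite for the CUDA kernels, test_flash_attn_ck.py covers the AMD ROCm (Composable Kernel) backend, and test_util.py provides shared reference helpers. The files are ordinary pytest modules, so something like `pytest -q -s tests/test_flash_attn.py` runs the main suite. Below is a minimal sketch of the kind of check such a test performs, comparing `flash_attn_func` against a plain fp32 PyTorch reference; the shapes, the reference helper, and the tolerances are illustrative assumptions, not the suite's actual parameters.

```python
# A minimal sketch, not the repository's actual test: compare the fused
# flash_attn_func kernel against a plain fp32 PyTorch reference.
# Shapes and tolerances here are illustrative assumptions.
import math

import torch
from flash_attn import flash_attn_func


def attention_ref(q, k, v):
    """Reference attention in fp32: softmax(Q K^T / sqrt(d)) V."""
    q, k, v = q.float(), k.float(), v.float()
    scores = torch.einsum("bshd,bthd->bhst", q, k) / math.sqrt(q.shape[-1])
    return torch.einsum("bhst,bthd->bshd", scores.softmax(dim=-1), v)


batch, seqlen, nheads, headdim = 2, 128, 4, 64
q = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

out = flash_attn_func(q, k, v)  # fused kernel; output is (batch, seqlen, nheads, headdim)
out_ref = attention_ref(q, k, v)
assert torch.allclose(out.float(), out_ref, atol=1e-2, rtol=1e-2)
```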