Repository: squall/flash-attention
Path: tests
Branch: main
Latest commit: 8cbc8a042f "small fixes" (Ying Zhang, 2024-09-16 14:54:39 -07:00)

Name                     Last commit message                                                 Date
layers/                  Run isort and black on test files                                   2023-08-18 20:59:35 -07:00
losses/                  [CrossEntropy] Support precomputed LSE                              2024-09-08 09:24:43 -07:00
models/                  Add test for BTLM init                                              2023-12-25 15:16:27 -08:00
modules/                 Run isort and black on test files                                   2023-08-18 20:59:35 -07:00
ops/                     [LayerNorm] Rename layernorm.py -> layer_norm.py                    2024-01-05 00:21:03 -08:00
pyproject.toml           Move pyproject.toml to flash-attn and tests dir to avoid PEP 517    2023-08-25 15:05:28 -07:00
test_flash_attn_ck.py    Support page kvcache in AMD ROCm (#1198)                            2024-09-15 23:17:28 -07:00
test_flash_attn.py       minor changes to unpad_input test util func                         2024-09-16 14:24:11 -07:00
test_rotary.py           minor changes to unpad_input test util func                         2024-09-16 14:24:11 -07:00
test_util.py             small fixes                                                         2024-09-16 14:54:39 -07:00