flash-attention/tests
Latest commit: 2a15840f09 by Grigory Sizov (2024-03-15 00:48:19 -07:00)
Enable paged attention in varlen forward (#831)
* Enable paged attention in varlen forward
* Format + fix padding
layers/             Run isort and black on test files                                  2023-08-18 20:59:35 -07:00
losses/             return z_loss (#768)                                               2024-01-21 15:23:41 -08:00
models/             Add test for BTLM init                                             2023-12-25 15:16:27 -08:00
modules/            Run isort and black on test files                                  2023-08-18 20:59:35 -07:00
ops/                [LayerNorm] Rename layernorm.py -> layer_norm.py                   2024-01-05 00:21:03 -08:00
pyproject.toml      Move pyproject.toml to flash-attn and tests dir to avoid PEP 517   2023-08-25 15:05:28 -07:00
test_flash_attn.py  Enable paged attention in varlen forward (#831)                    2024-03-15 00:48:19 -07:00
test_rotary.py      [Rotary] Implement varlen rotary                                   2023-09-03 17:57:10 -07:00
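
The latest change to test_flash_attn.py (#831) enables paged attention in the varlen forward path. Below is a minimal sketch of the kind of call that path exercises, assuming the flash_attn package exposes a block_table argument on flash_attn_varlen_func (as this PR suggests); the shapes, page size, and block-table layout here are illustrative assumptions, not taken from the test file.

# Sketch: paged attention through the varlen forward path.
# Assumes flash_attn_varlen_func accepts block_table (per #831); shapes are
# illustrative only. Requires a CUDA GPU and the flash-attn package.
import torch
from flash_attn import flash_attn_varlen_func

nheads, headdim = 8, 64
page_size, num_pages = 256, 16  # paged KV cache: num_pages blocks of page_size tokens
seqlens_q = torch.tensor([5, 7], dtype=torch.int32, device="cuda")
seqlens_k = torch.tensor([128, 300], dtype=torch.int32, device="cuda")

# Varlen inputs are packed; cu_seqlens holds cumulative offsets per batch entry.
cu_seqlens_q = torch.nn.functional.pad(seqlens_q.cumsum(0, dtype=torch.int32), (1, 0))
cu_seqlens_k = torch.nn.functional.pad(seqlens_k.cumsum(0, dtype=torch.int32), (1, 0))

q = torch.randn(int(seqlens_q.sum()), nheads, headdim, dtype=torch.float16, device="cuda")
# With paging, K/V live in a block pool rather than a packed (total_k, nheads, headdim) tensor.
k_cache = torch.randn(num_pages, page_size, nheads, headdim, dtype=torch.float16, device="cuda")
v_cache = torch.randn_like(k_cache)

# block_table[i, j] = index of the j-th page backing sequence i's KV cache
# (assumed layout; here each sequence simply gets consecutive pages).
max_pages_per_seq = (int(seqlens_k.max()) + page_size - 1) // page_size
block_table = torch.arange(
    2 * max_pages_per_seq, dtype=torch.int32, device="cuda"
).reshape(2, max_pages_per_seq)

out = flash_attn_varlen_func(
    q, k_cache, v_cache,
    cu_seqlens_q, cu_seqlens_k,
    max_seqlen_q=int(seqlens_q.max()),
    max_seqlen_k=int(seqlens_k.max()),
    causal=True,
    block_table=block_table,
)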