Directory listing: flash-attention/flash_attn (last commit 2023-09-05 11:34:13 -07:00)
Name                                  Last commit message                                                Date
layers                                [Rotary] Implement varlen rotary                                   2023-09-03 17:57:10 -07:00
losses                                Run isort and black on python files                                2023-08-18 14:22:11 -07:00
models                                Fix test_baichuan                                                  2023-09-03 21:01:37 -07:00
modules                               [MLP] Implement SwiGLU with torch jiterator                        2023-09-04 15:43:53 -07:00
ops                                   Create __init__.py for ops/triton dir (#516)                       2023-09-05 11:29:03 -07:00
utils                                 [Gen] Refactor decoding function                                   2023-09-04 17:01:38 -07:00
__init__.py                           Bump to v2.2.0                                                     2023-09-05 11:34:13 -07:00
bert_padding.py                       add unpad_input_for_concatenated_sequences (#499)                  2023-08-29 02:23:56 -07:00
flash_attn_interface.py               Support cache_seqlens being integer                                2023-09-05 11:27:48 -07:00
flash_attn_triton_og.py               Run isort and black on python files                                2023-08-18 14:22:11 -07:00
flash_attn_triton.py                  Run isort and black on python files                                2023-08-18 14:22:11 -07:00
flash_blocksparse_attention.py        Run isort and black on python files                                2023-08-18 14:22:11 -07:00
flash_blocksparse_attn_interface.py   Run isort and black on python files                                2023-08-18 14:22:11 -07:00
fused_softmax.py                      Run isort and black on python files                                2023-08-18 14:22:11 -07:00
pyproject.toml                        Move pyproject.toml to flash-attn and tests dir to avoid PEP 517   2023-08-25 15:05:28 -07:00