flash-attention/flash_attn (last commit: 2023-10-24)
Name                                 Last commit message                                                 Date
layers/                              Fix NameError and typo in ApplyRotaryEmbQKV_ (#569)                 2023-09-25
losses/                              [CE] Implement CrossEntropyLoss in Triton                           2023-09-15
models/                              Fix E1136 (#563)                                                    2023-09-21
modules/                             [Llama] Fix some tests, add tests for Llama 2 and CodeLlama         2023-09-20
ops/                                 [CrossEntropy] Fix triton cross_entropy_loss IMA for >=2B elements  2023-10-24
utils/                               [Gen] Simplify decode_speculative                                   2023-09-19
__init__.py                          Bump to v2.3.2                                                      2023-10-08
bert_padding.py                      add unpad_input_for_concatenated_sequences (#499)                   2023-08-29
flash_attn_interface.py              [Gen] Accept cache_batch_idx to index into the KV cache             2023-10-03
flash_attn_triton_og.py              Run isort and black on python files                                 2023-08-18
flash_attn_triton.py                 Run isort and black on python files                                 2023-08-18
flash_blocksparse_attention.py       Run isort and black on python files                                 2023-08-18
flash_blocksparse_attn_interface.py  Run isort and black on python files                                 2023-08-18
fused_softmax.py                     Run isort and black on python files                                 2023-08-18
pyproject.toml                       Move pyproject.toml to flash-attn and tests dir to avoid PEP 517    2023-08-25