flash-attention/flash_attn (latest commit: 2023-08-27 23:19:58 -07:00)
Name                                 Last commit message                                                   Last commit date
layers/                              Run isort and black on python files                                   2023-08-18 14:22:11 -07:00
losses/                              Run isort and black on python files                                   2023-08-18 14:22:11 -07:00
models/                              [GPT] Generalize last_token_only arg to num_last_tokens               2023-08-26 20:47:53 -07:00
modules/                             Change causal for CrossAttention in mha.py to align to bottom right   2023-08-26 12:57:33 -07:00
ops/                                 Run isort and black on python files                                   2023-08-18 14:22:11 -07:00
utils/                               [Gen] Clone logits before returning when cg=True                      2023-08-27 23:19:58 -07:00
__init__.py                          Change causal mask to be aligned to bottom-right instead of top-left  2023-08-24 23:41:07 -07:00
bert_padding.py                      Run isort and black on python files                                   2023-08-18 14:22:11 -07:00
flash_attn_interface.py              Change causal mask to be aligned to bottom-right instead of top-left  2023-08-24 23:41:07 -07:00
flash_attn_triton_og.py              Run isort and black on python files                                   2023-08-18 14:22:11 -07:00
flash_attn_triton.py                 Run isort and black on python files                                   2023-08-18 14:22:11 -07:00
flash_blocksparse_attention.py       Run isort and black on python files                                   2023-08-18 14:22:11 -07:00
flash_blocksparse_attn_interface.py  Run isort and black on python files                                   2023-08-18 14:22:11 -07:00
fused_softmax.py                     Run isort and black on python files                                   2023-08-18 14:22:11 -07:00
pyproject.toml                       Move pyproject.toml to flash-attn and tests dir to avoid PEP 517      2023-08-25 15:05:28 -07:00