flash-attention/flash_attn/modules
File         | Last commit message                                               | Date
__init__.py  | Add __init__.py files to subdirectories for installation          | 2022-11-17 16:55:44 -08:00
block.py     | Implement GPT-J                                                   | 2023-03-22 16:16:58 -07:00
embedding.py | Reorder LN in Block, support OPT                                  | 2023-01-15 22:14:31 -08:00
mha.py       | [FT] Fix FT's single query attention for bf16 hdim128 rotary      | 2023-03-28 21:27:00 -07:00
mlp.py       | [FusedDense] Support relu, rename FusedDenseGeluDense -> FusedMLP | 2023-01-17 18:12:27 -08:00