flash-attention/flash_attn/modules
__init__.py     Add __init__.py files to subdirectories for installation           2022-11-17 16:55:44 -08:00
block.py        [LayerNorm] Implement LN with parallel residual, support dim 8k    2023-03-31 14:23:45 -07:00
embedding.py    Reorder LN in Block, support OPT                                   2023-01-15 22:14:31 -08:00
mha.py          Have a separate nn.Dropout module in SelfAttention module          2023-04-17 22:34:05 -07:00
mlp.py          make mlp hidden_features defaults to 4*in_features                 2023-04-13 11:08:21 +08:00
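The mlp.py commit message describes a defaulting rule: when `hidden_features` is not supplied, it falls back to `4 * in_features` (the standard 4x expansion used in transformer MLP blocks). A minimal plain-Python sketch of that rule, with illustrative names rather than the actual flash-attention API:

```python
# Sketch of the defaulting rule described for mlp.py:
# hidden_features falls back to 4 * in_features, and
# out_features falls back to in_features.
# MlpConfig is a hypothetical name, not the real module.
class MlpConfig:
    def __init__(self, in_features, hidden_features=None, out_features=None):
        self.in_features = in_features
        # 4x expansion is the conventional transformer MLP default.
        self.hidden_features = (
            hidden_features if hidden_features is not None else 4 * in_features
        )
        self.out_features = (
            out_features if out_features is not None else in_features
        )

cfg = MlpConfig(in_features=768)
```

With `in_features=768` and no explicit `hidden_features`, this yields a 3072-wide hidden layer projecting back to 768.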