flash-attention/tests
Xuechen Li 7fcd3e6a04
map custom model state_dict back to huggingface format (#465)
* Fix name.
* Set up the inverse function.
* Add the map-back function.
* Handle GQA (grouped-query attention).
* Add type annotations to avoid confusion.
* Fix docstring.
* Test the inverse remap logic.
2023-08-18 20:51:39 -07:00
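
The squashed commits above add an inverse of the HuggingFace-to-custom state_dict remapping: a function that renames the custom model's parameter keys back to HuggingFace format so trained checkpoints can be exported, with extra care for GQA and a round-trip test. A minimal sketch of the idea in Python, using hypothetical rename rules and a generic remap_state_dict helper rather than the actual names in flash_attn/models/gpt.py:

    import re

    import torch

    # Hypothetical rename rules for illustration only; the real mapping in
    # flash_attn/models/gpt.py also covers embeddings, LayerNorms, the LM head,
    # and biases. With GQA the fused Wqkv projection holds fewer K/V heads than
    # Q heads, so the real inverse must also split Wqkv with unequal chunk
    # sizes, not just rename keys.
    _TO_CUSTOM = [  # HuggingFace key prefix -> custom key prefix
        (r"^transformer\.h\.(\d+)\.attn\.c_attn\.", r"transformer.layers.\1.mixer.Wqkv."),
        (r"^transformer\.h\.(\d+)\.mlp\.c_fc\.", r"transformer.layers.\1.mlp.fc1."),
    ]
    _TO_HF = [  # exact inverse of the rules above
        (r"^transformer\.layers\.(\d+)\.mixer\.Wqkv\.", r"transformer.h.\1.attn.c_attn."),
        (r"^transformer\.layers\.(\d+)\.mlp\.fc1\.", r"transformer.h.\1.mlp.c_fc."),
    ]

    def remap_state_dict(state_dict, rules):
        """Rename every key by the first matching (pattern, replacement) rule."""
        def rename(key):
            for pattern, repl in rules:
                new_key, n_subs = re.subn(pattern, repl, key)
                if n_subs:
                    return new_key
            return key
        return {rename(k): v for k, v in state_dict.items()}

    # Round-trip check in the spirit of "test the inverse remap logic":
    # HF -> custom -> HF must reproduce the original keys and tensors.
    hf_sd = {
        "transformer.h.0.attn.c_attn.weight": torch.randn(3 * 64, 64),
        "transformer.h.0.mlp.c_fc.weight": torch.randn(256, 64),
    }
    restored = remap_state_dict(remap_state_dict(hf_sd, _TO_CUSTOM), _TO_HF)
    assert restored.keys() == hf_sd.keys()
    assert all(torch.equal(restored[k], hf_sd[k]) for k in hf_sd)

The round-trip test here is the simplest useful check: if the inverse rules are wrong or incomplete, some key survives in custom form and the key-set assertion fails immediately.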
layers              [Rotary] Implement GPT-J style (interleaved) rotary (see sketch below)    2023-03-14 14:35:53 -07:00
losses              Tweak CrossEntropyLoss to take process_group in init                      2022-12-27 10:47:43 -08:00
models              map custom model state_dict back to huggingface format (#465)             2023-08-18 20:51:39 -07:00
modules             Implement ParallelGatedMlp (#251)                                         2023-07-26 12:14:15 -07:00
ops                 [LayerNorm] Add test for randomness                                       2023-07-23 12:31:55 -10:00
test_flash_attn.py  Fix Bwd NaN for varlen when seqlen_q >> seqlen_k and causal               2023-08-16 15:12:36 -07:00
test_rotary.py      Add MLP, MHA, Block, Embedding modules                                    2022-11-13 22:06:44 -08:00
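
The layers entry above references the GPT-J style (interleaved) rotary embedding: instead of rotating the first and second halves of the head dimension against each other (GPT-NeoX style), it rotates adjacent even/odd feature pairs (x0, x1), (x2, x3), .... A minimal sketch of the interleaved variant, independent of the actual flash_attn.layers.rotary API:

    import torch

    def rotary_interleaved(x, base=10000.0):
        """GPT-J style rotary embedding on x of shape (batch, seqlen, headdim):
        each adjacent pair (x0, x1), (x2, x3), ... is rotated by a
        position- and frequency-dependent angle."""
        _, seqlen, dim = x.shape
        inv_freq = 1.0 / (base ** (torch.arange(0, dim, 2, dtype=torch.float32) / dim))
        angles = torch.outer(torch.arange(seqlen, dtype=torch.float32), inv_freq)
        cos, sin = angles.cos(), angles.sin()          # each (seqlen, dim / 2)
        x1, x2 = x[..., ::2], x[..., 1::2]             # even / odd half of each pair
        out = torch.stack((x1 * cos - x2 * sin,        # standard 2D rotation per pair
                           x1 * sin + x2 * cos), dim=-1)
        return out.flatten(-2)                         # re-interleave the pairs

    q = torch.randn(2, 16, 64)
    assert rotary_interleaved(q).shape == q.shape

By contrast, the non-interleaved (GPT-NeoX) style rotates x[..., :dim // 2] against x[..., dim // 2:] as two contiguous halves.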