flash-attention/tests (last commit: 2023-01-07 14:37:54 -08:00)
losses/              Tweak CrossEntropyLoss to take process_group in init      2022-12-27 10:47:43 -08:00
models/              [Gen] Test generation with rotary embedding               2023-01-07 14:37:54 -08:00
modules/             [TP] Implement TensorParallel without sequence parallel   2023-01-07 13:45:22 -08:00
ops/                 [TP] Implement TensorParallel without sequence parallel   2023-01-07 13:45:22 -08:00
test_flash_attn.py   Skip flash_attn_split test                                2022-11-13 12:27:48 -08:00
test_rotary.py       Add MLP, MHA, Block, Embedding modules                    2022-11-13 22:06:44 -08:00