flash-attention/tests
Last commit: 2022-10-13 20:49:02 -07:00
test_flash_attn.py: Implement attention kernel that splits the batch into two (2022-10-13 20:49:02 -07:00)