flash-attention/csrc  (last commit: 2022-10-24 16:04:21 -07:00)
flash_attn       Support all head dims that are multiples of 8, up to 128    2022-10-24 16:04:21 -07:00
fused_softmax    Add Megatron attention implementation for benchmarking      2022-10-23 23:04:16 -07:00