flash-attention/csrc
Directory        Last commit                                                    Date
flash_attn       Support H100                                                   2023-03-15 14:59:02 -07:00
ft_attention     [FT] Fix FT's single query attention for bf16 hdim128 rotary  2023-03-28 21:27:00 -07:00
fused_dense_lib  Support H100 for other CUDA extensions                        2023-03-15 16:59:27 -07:00
fused_softmax    Add Megatron attention implementation for benchmarking        2022-10-23 23:04:16 -07:00
layer_norm       Support H100 for other CUDA extensions                        2023-03-15 16:59:27 -07:00
rotary           Support H100 for other CUDA extensions                        2023-03-15 16:59:27 -07:00
xentropy         Support H100 for other CUDA extensions                        2023-03-15 16:59:27 -07:00