flash-attention/csrc
Name                Last commit                                                       Last commit date
cutlass@c4f6b8c6bc  FlashAttention-2 release                                          2023-07-17 06:21:34 -07:00
flash_attn          Fix compile error on MSVC                                         2023-07-19 08:04:57 +00:00
ft_attention        [FT] rotary_cos/sin should have batch_size dimension              2023-07-06 15:33:33 -07:00
fused_dense_lib     [FusedDense] Allocate lt_workspace on input device                2023-05-30 14:17:26 -07:00
fused_softmax       Add Megatron attention implementation for benchmarking            2022-10-23 23:04:16 -07:00
layer_norm          [LayerNorm] Implement LN with parallel residual, support dim 8k   2023-03-31 14:23:45 -07:00
rotary              Support H100 for other CUDA extensions                            2023-03-15 16:59:27 -07:00
xentropy            Support H100 for other CUDA extensions                            2023-03-15 16:59:27 -07:00