flash-attention/csrc
Directory         Last commit                                                           Date
flash_attn        [FA] Remove unused variable rng_engine_inputs                         2023-01-25 15:32:40 -08:00
ft_attention      [Gen] Pass qkv_stride to ft_attention kernel for batched generation   2023-01-15 15:20:01 -08:00
fused_dense_lib   [FusedDense] Support relu, rename FusedDenseGeluDense -> FusedMLP     2023-01-17 18:12:27 -08:00
fused_softmax     Add Megatron attention implementation for benchmarking                2022-10-23 23:04:16 -07:00
layer_norm        [LayerNorm] Rename x1 -> residual                                     2023-01-19 13:07:27 -08:00
rotary            [Rotary] Implement GPT-J style (interleaved) rotary                   2023-03-14 14:35:53 -07:00
xentropy          Add smoothing for CrossEntropyParallel, rename to CrossEntropyLoss    2022-12-23 14:51:08 -08:00