flash-attention/flash_attn/ops
Latest commit: 2022-12-25 14:08:21 -08:00
Name                 Last commit message                                         Last commit date
triton/              Add GPT and ViT models                                      2022-11-13 22:30:23 -08:00
__init__.py          Add __init__.py files to subdirectories for installation   2022-11-17 16:55:44 -08:00
fused_dense.py       Implement Tensor Parallel for transformer Block             2022-12-25 14:08:21 -08:00
gelu_activation.py   Add fused_dense and dropout_add_layernorm CUDA extensions   2022-11-13 21:59:20 -08:00
layer_norm.py        Implement BERT                                              2022-12-18 21:47:27 -08:00
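
For orientation, below is a minimal usage sketch of the fused ops this directory appears to provide. The class names FusedDense and DropoutAddLayerNorm, their constructor arguments, and the residual-argument convention are assumptions inferred from the file names and commit messages above, not confirmed by this listing; treat this as a sketch, not the module's documented API.

# Usage sketch under assumed API: FusedDense (fused bias + GEMM, a drop-in
# for torch.nn.Linear) and DropoutAddLayerNorm (fused dropout + residual add
# + LayerNorm). All names and signatures here are assumptions.
import torch
from flash_attn.ops.fused_dense import FusedDense          # assumed export of fused_dense.py
from flash_attn.ops.layer_norm import DropoutAddLayerNorm  # assumed export of layer_norm.py

hidden = 1024
x = torch.randn(2, 128, hidden, device="cuda", dtype=torch.float16)

# Assumed drop-in replacement for torch.nn.Linear with a fused bias epilogue.
dense = FusedDense(hidden, hidden).to(device="cuda", dtype=torch.float16)

# Assumed fused dropout + residual add + LayerNorm module; p is the dropout prob.
norm = DropoutAddLayerNorm(hidden, p=0.1).to(device="cuda", dtype=torch.float16)

out = norm(dense(x), x)  # second argument assumed to be the residual branch
print(out.shape)         # expected: torch.Size([2, 128, 1024])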