flash-attention/csrc/layer_norm/README.md

This CUDA extension implements fused dropout + residual + LayerNorm, based on Apex's FastLayerNorm. We add dropout and a residual connection, and support both pre-norm and post-norm architectures.
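
For reference, here is a minimal, unfused NumPy sketch of what the fused kernel computes (dropout on the input, residual add, then LayerNorm over the last dimension). The function name and signature are illustrative only, not the extension's API.

```python
import numpy as np

def dropout_add_layer_norm(x, residual, gamma, beta,
                           p=0.1, eps=1e-5, training=True, rng=None):
    # Hypothetical reference implementation; the CUDA extension fuses these steps.
    if training and p > 0:
        rng = rng or np.random.default_rng(0)
        mask = rng.random(x.shape) >= p
        x = x * mask / (1.0 - p)          # inverted dropout scaling
    z = x + residual                      # residual add
    mu = z.mean(axis=-1, keepdims=True)
    var = z.var(axis=-1, keepdims=True)
    normed = (z - mu) / np.sqrt(var + eps)
    # Return z as well: in a pre-norm architecture the *un-normalized* sum z
    # feeds the next residual stream, while post-norm passes on the normalized output.
    return normed * gamma + beta, z
```
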

This only supports a limited set of hidden dimensions; see csrc/layer_norm/ln_fwd_cuda_kernel.cu for the supported sizes.

It has only been tested on A100s.

To install:
```sh
cd csrc/layer_norm && pip install .
```