This CUDA extension implements fused dropout + residual + LayerNorm, based on Apex's FastLayerNorm. We add dropout and a residual connection, and make it work for both pre-norm and post-norm architectures.
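
For reference, the fused kernel computes the equivalent of the unfused PyTorch code below. This is a minimal sketch of the math only; the function name and argument layout here are illustrative and are not the extension's Python API.

```python
import torch
import torch.nn.functional as F

def dropout_add_layer_norm_ref(x, residual, weight, bias, dropout_p, eps, prenorm=False):
    """Unfused reference: dropout(x) + residual, then LayerNorm.

    With prenorm=True, also return the pre-LayerNorm sum so it can feed the
    next block's residual stream (pre-norm architecture); with prenorm=False,
    return only the normalized output (post-norm architecture).
    """
    dropped = F.dropout(x, p=dropout_p, training=True)
    summed = dropped + residual if residual is not None else dropped
    out = F.layer_norm(summed, (summed.shape[-1],), weight, bias, eps)
    return (out, summed) if prenorm else out
```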

It has only been tested on A100s.

To install:
```sh
cd csrc/layer_norm && pip install .
```