This CUDA extension implements fused dropout + residual + LayerNorm, based on
Apex's [FastLayerNorm](https://github.com/NVIDIA/apex/tree/master/apex/contrib/layer_norm).
We add dropout and residual, and make it work for both pre-norm and post-norm architectures.
This extension supports only a limited set of hidden dimensions; see `csrc/layer_norm/ln_fwd_cuda_kernel.cu` for the list.
It has only been tested on A100 GPUs.

To install:
```sh
cd csrc/layer_norm && pip install .
```
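
For reference, the fused kernel computes (in one pass) what the following unfused NumPy sketch computes: apply dropout to the main branch, add the residual, then LayerNorm the sum. This is an illustrative reference of the semantics only, not the actual CUDA implementation or its Python API; the function name and signature here are hypothetical.

```python
import numpy as np

def dropout_add_layer_norm_ref(x0, residual, weight, bias,
                               dropout_p, eps=1e-5, training=True, seed=0):
    """Unfused reference: out = LayerNorm(dropout(x0) + residual).

    Returns (out, x) where x = dropout(x0) + residual is the pre-norm
    residual stream; a post-norm architecture would use only `out`.
    """
    rng = np.random.default_rng(seed)
    if training and dropout_p > 0.0:
        # Inverted dropout: zero out elements, rescale survivors by 1/(1-p)
        mask = rng.random(x0.shape) >= dropout_p
        x0 = x0 * mask / (1.0 - dropout_p)
    x = x0 + residual
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    out = (x - mu) / np.sqrt(var + eps) * weight + bias
    return out, x

# Example: with dropout_p=0, weight=1, bias=0, each row of `out` is
# normalized to (approximately) zero mean and unit variance.
h = 8
x0 = np.random.default_rng(1).normal(size=(2, h))
res = np.random.default_rng(2).normal(size=(2, h))
out, x = dropout_add_layer_norm_ref(x0, res, np.ones(h), np.zeros(h),
                                    dropout_p=0.0)
```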