[Docs] Mention that dropout_layer_norm supports all dims up to 6k

This commit is contained in:
Tri Dao 2022-12-29 23:55:33 -08:00
parent 85b8e3d334
commit 3c7cbfc195


@@ -82,7 +82,7 @@ cd ../csrc/rotary && pip install .
```
5. Fused dropout + residual + LayerNorm, adapted from Apex's
[FastLayerNorm](https://github.com/NVIDIA/apex/tree/master/apex/contrib/layer_norm). We add dropout and residual, and make it work for both pre-norm and post-norm architectures.
-This only supports a limited set of dimensions, see `csrc/layer_norm/ln_fwd_cuda_kernel.cu`.
+This supports dimensions divisible by 8, up to 6144.
```sh
cd ../csrc/layer_norm && pip install .
```
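The new wording pins down the constraint: the fused kernel handles hidden dimensions that are divisible by 8 and at most 6144. As a quick sanity check, here is a minimal sketch of that rule — `dropout_layer_norm_dim_supported` is a hypothetical helper for illustration, not part of the package's API:

```python
def dropout_layer_norm_dim_supported(dim: int) -> bool:
    # Hypothetical check mirroring the README's stated constraint:
    # the fused dropout + residual + LayerNorm kernel supports
    # hidden dimensions divisible by 8, up to 6144.
    return dim > 0 and dim % 8 == 0 and dim <= 6144

# Typical transformer hidden sizes fall inside the supported range:
print(dropout_layer_norm_dim_supported(768))   # True
print(dropout_layer_norm_dim_supported(6144))  # True
# Not divisible by 8, or larger than 6144: falls back to unfused ops.
print(dropout_layer_norm_dim_supported(1022))  # False
print(dropout_layer_norm_dim_supported(8192))  # False
```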