[Docs] Mention that dropout_layer_norm supports all dims up to 6k
parent
85b8e3d334
commit
3c7cbfc195
@@ -82,7 +82,7 @@ cd ../csrc/rotary && pip install .
 ```
 5. Fused dropout + residual + LayerNorm, adapted from Apex's
 [FastLayerNorm](https://github.com/NVIDIA/apex/tree/master/apex/contrib/layer_norm). We add dropout and residual, and make it work for both pre-norm and post-norm architecture.
-This only supports a limited set of dimensions, see `csrc/layer_norm/ln_fwd_cuda_kernel.cu`.
+This supports dimensions divisible by 8, up to 6144.
 ```sh
 cd ../csrc/layer_norm && pip install .
 ```
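The added line pins down the supported shapes precisely. As a minimal sketch of the stated constraint (the helper name `is_supported_dim` is hypothetical, not part of the library's API):

```python
def is_supported_dim(d: int) -> bool:
    """Check the constraint stated in the docs for the fused
    dropout + residual + LayerNorm kernel: hidden dimensions
    must be divisible by 8 and at most 6144."""
    return 0 < d <= 6144 and d % 8 == 0

# Common transformer hidden sizes all satisfy the constraint,
# while 6152 (too large) and 770 (not a multiple of 8) do not.
supported = [d for d in (768, 1024, 4096, 6144, 6152, 770) if is_supported_dim(d)]
```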