From 3c7cbfc1952337932455a5517a17eb77219f092a Mon Sep 17 00:00:00 2001
From: Tri Dao
Date: Thu, 29 Dec 2022 23:55:33 -0800
Subject: [PATCH] [Docs] Mention that dropout_layer_norm supports all dims up
 to 6k

---
 training/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/training/README.md b/training/README.md
index 8f81ff1..d1ada1d 100644
--- a/training/README.md
+++ b/training/README.md
@@ -82,7 +82,7 @@ cd ../csrc/rotary && pip install .
 ```
 5. Fused dropout + residual + LayerNorm, adapted from Apex's [FastLayerNorm](https://github.com/NVIDIA/apex/tree/master/apex/contrib/layer_norm).
 We add dropout and residual, and make it work for both pre-norm and post-norm architecture.
-This only supports a limited set of dimensions, see `csrc/layer_norm/ln_fwd_cuda_kernel.cu`.
+This supports dimensions divisible by 8, up to 6144.
 ```sh
 cd ../csrc/layer_norm && pip install .
 ```
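
Usage note (not part of the patch): a minimal sketch of calling the fused dropout + residual + LayerNorm op that the `csrc/layer_norm` extension backs. It assumes the `flash_attn` package exposes `dropout_add_layer_norm` under `flash_attn.ops.layer_norm` with the signature shown; the import path, argument names, and shapes may differ between releases, so check the installed version.

```python
# Sketch only: fused dropout + residual + LayerNorm, post-norm style.
# Assumes flash_attn exposes dropout_add_layer_norm at this path with this
# signature; verify against the installed version.
import torch
from flash_attn.ops.layer_norm import dropout_add_layer_norm

hidden_dim = 2048  # per the patch: must be divisible by 8 and at most 6144
x0 = torch.randn(8, 512, hidden_dim, device="cuda", dtype=torch.float16)
residual = torch.randn_like(x0)
weight = torch.ones(hidden_dim, device="cuda", dtype=torch.float16)
bias = torch.zeros(hidden_dim, device="cuda", dtype=torch.float16)

# Computes LayerNorm(dropout(x0) + residual) in a single kernel.
# prenorm=True would additionally return the un-normalized sum, which
# pre-norm architectures carry forward as the residual stream.
out = dropout_add_layer_norm(x0, residual, weight, bias,
                             dropout_p=0.1, epsilon=1e-5, prenorm=False)
```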