This CUDA extension implements fused dropout + residual + LayerNorm, building on
Apex's [FastLayerNorm](https://github.com/NVIDIA/apex/tree/master/apex/contrib/layer_norm).
We add dropout and residual, and make it work for both pre-norm and post-norm architectures.
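
What the fused kernel computes can be written as an unfused PyTorch reference. The function name and signature below are illustrative, not the extension's actual API; this is a minimal sketch of the dropout + residual + LayerNorm pattern and of how pre-norm and post-norm differ in what they return.

```python
import torch

def dropout_add_layer_norm_ref(x, residual, ln, p=0.1, prenorm=False, training=True):
    """Unfused reference for fused dropout + residual + LayerNorm.

    `ln` is a torch.nn.LayerNorm module. This is a sketch, not the
    extension's real interface.
    """
    # dropout(x) + residual, then LayerNorm
    added = torch.nn.functional.dropout(x, p=p, training=training) + residual
    out = ln(added)
    # Pre-norm architectures also need the un-normalized sum for the next
    # residual branch; post-norm only needs the normalized output.
    return (out, added) if prenorm else out
```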
We also make it work for more hidden dimensions (all dimensions divisible by 8, up to 6144).
We also implement RMSNorm as an option.
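
For reference, RMSNorm differs from LayerNorm in that it skips mean-centering and the bias, scaling only by the root mean square. A minimal sketch (illustrative names, not the extension's API):

```python
import torch

def rms_norm_ref(x, weight, eps=1e-5):
    # RMSNorm: divide by the root-mean-square over the last dimension
    # instead of subtracting the mean and dividing by the std; no bias.
    rms = torch.sqrt(x.pow(2).mean(dim=-1, keepdim=True) + eps)
    return x / rms * weight
```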
If you want to use it for dimensions larger than 6k, please file an issue.
This extension has only been tested on A100s.
```sh
cd csrc/layer_norm && pip install .
```