This CUDA extension implements fused dropout + residual + LayerNorm, based on Apex's FastLayerNorm. We add dropout and the residual connection, and support both pre-norm and post-norm architectures.
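As a point of reference for what the fused kernel computes, here is a minimal unfused NumPy sketch of dropout + residual + LayerNorm (function name and signature are illustrative, not the extension's actual API). With `prenorm=True` it also returns the pre-normalization sum, which pre-norm architectures carry forward as the residual stream:

```python
import numpy as np

def dropout_add_layer_norm(x, residual, gamma, beta, p, eps=1e-5,
                           prenorm=False, rng=None):
    """Unfused reference: out = LayerNorm(dropout(x) + residual).

    Hypothetical reference implementation for illustration only;
    the CUDA extension fuses these steps into a single kernel.
    """
    rng = rng or np.random.default_rng(0)
    if p > 0.0:
        # Inverted dropout: scale kept activations by 1/(1-p)
        mask = rng.random(x.shape) >= p
        x = x * mask / (1.0 - p)
    z = x + residual  # residual add happens before the norm
    mu = z.mean(axis=-1, keepdims=True)
    var = z.var(axis=-1, keepdims=True)
    out = (z - mu) / np.sqrt(var + eps) * gamma + beta
    # Pre-norm blocks need the un-normalized sum z for the next layer's
    # residual; post-norm blocks only need the normalized output.
    return (out, z) if prenorm else out
```

The fused kernel avoids materializing the intermediate `z` in global memory, which is where the speedup over this three-pass version comes from.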
It has only been tested on A100s.
To install:

```sh
cd csrc/layer_norm && pip install .
```