Recommend Nvidia's Pytorch container

This commit is contained in:
Tri Dao 2023-05-19 09:40:21 -07:00
parent 3cad2ab35d
commit f0c40b7ddb


@@ -48,6 +48,10 @@ Requirements:
- CUDA 11.4 and above.
- PyTorch 1.12 and above.
We recommend the
[Pytorch](https://catalog.ngc.nvidia.com/orgs/nvidia/containers/pytorch)
container from Nvidia, which has all the required tools to install FlashAttention.
To install:
```sh
pip install flash-attn
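# The NGC container recommended above ships matching CUDA and PyTorch builds
# plus the CUDA toolchain needed to compile FlashAttention. A minimal sketch
# of launching it before running the install (the 23.05-py3 tag is an
# assumption, not taken from this diff; pick a current tag from NGC):
docker run --gpus all -it --rm nvcr.io/nvidia/pytorch:23.05-py3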