Recommend Nvidia's Pytorch container
parent 3cad2ab35d
commit f0c40b7ddb
@@ -48,6 +48,10 @@ Requirements:
 - CUDA 11.4 and above.
 - PyTorch 1.12 and above.

+We recommend the
+[Pytorch](https://catalog.ngc.nvidia.com/orgs/nvidia/containers/pytorch)
+container from Nvidia, which has all the required tools to install FlashAttention.
+
 To install:
 ```sh
 pip install flash-attn
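For context, the workflow this change recommends could look like the following shell sketch. The image tag (`24.01-py3`) and the `docker run` options are illustrative assumptions, not part of the commit; only `pip install flash-attn` comes from the diff above.

```sh
# Pull and start NVIDIA's PyTorch container from NGC.
# The tag 24.01-py3 is an assumed example; pick a current tag from the NGC catalog.
docker pull nvcr.io/nvidia/pytorch:24.01-py3
docker run --gpus all -it --rm nvcr.io/nvidia/pytorch:24.01-py3

# Inside the container, which already ships CUDA and PyTorch,
# install FlashAttention as described in the README:
pip install flash-attn
```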