From f0c40b7ddb188bccee800628af45214cdced0ad5 Mon Sep 17 00:00:00 2001
From: Tri Dao
Date: Fri, 19 May 2023 09:40:21 -0700
Subject: [PATCH] Recommend Nvidia's Pytorch container

---
 README.md | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/README.md b/README.md
index 70529b5..abe6cf6 100644
--- a/README.md
+++ b/README.md
@@ -48,6 +48,10 @@ Requirements:
 - CUDA 11.4 and above.
 - PyTorch 1.12 and above.
 
+We recommend the
+[Pytorch](https://catalog.ngc.nvidia.com/orgs/nvidia/containers/pytorch)
+container from Nvidia, which has all the required tools to install FlashAttention.
+
 To install:
 ```sh
 pip install flash-attn
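
For reference, a minimal sketch of the workflow this patch recommends, assuming Docker with NVIDIA GPU support is set up; the `23.05-py3` tag below is only an illustrative example, pick a current tag from the NGC catalog page linked in the patch:

```sh
# Start Nvidia's Pytorch container (tag is illustrative; choose a current one
# from https://catalog.ngc.nvidia.com/orgs/nvidia/containers/pytorch).
docker run --gpus all -it --rm nvcr.io/nvidia/pytorch:23.05-py3

# Inside the container, CUDA, PyTorch, and the build toolchain are already present,
# so the install step from the README can run directly:
pip install flash-attn
```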