flash-attention/.github

workflows — [CI] Only compile for CUDA 11.8 & 12.2, MAX_JOBS=2, add torch-nightly (2023-11-27 16:21:28 -08:00)