flash-attention/csrc/flash_attn
Name                Last commit message                                        Last commit date
cutlass@319a389f42  Add Cutlass as submodule                                   2022-06-02 09:54:16 -07:00
src                 Do P * dP (pointwise) in the bwd in fp32 instead of fp16   2022-07-03 17:52:05 -07:00
fmha_api.cpp        Implement cross attention                                  2022-07-03 17:48:12 -07:00