flash-attention/csrc/flash_attn/src
File                                         Date                        Last commit message
fmha/                                        2022-10-21 12:04:27 -07:00  Rework dropout to decouple forward and backward
.DS_Store                                    2022-05-26 13:57:38 -07:00  Rename, add benchmarking script
fmha_block_dgrad_fp16_kernel_loop.sm80.cu    2022-07-03 17:48:12 -07:00  Implement cross attention
fmha_block_dgrad_kernel_1xN_loop.h           2022-07-09 23:18:26 -07:00  Refactor gemm_cl to template on either __half or __nv_bfloat16
fmha_block_fprop_fp16_kernel.sm80.cu         2022-07-03 17:48:12 -07:00  Implement cross attention
fmha_block_fprop_kernel_1xN.h                2022-07-09 23:18:26 -07:00  Refactor gemm_cl to template on either __half or __nv_bfloat16
fmha_blockmask.h                             2022-07-03 17:48:12 -07:00  Implement cross attention
fmha_dgrad_fp16_kernel_loop.sm80.cu          2022-10-04 21:31:39 -04:00  Replace BOOL_SWITCH with FP16_SWITCH to work around MSVC bug with constexpr variables and templates
fmha_dgrad_kernel_1xN_loop.h                 2022-10-21 12:04:27 -07:00  Rework dropout to decouple forward and backward
fmha_fprop_fp16_kernel.sm80.cu               2022-10-21 18:22:27 -07:00  Don't need to run configure for the forward pass
fmha_fprop_kernel_1xN.h                      2022-10-21 18:22:27 -07:00  Don't need to run configure for the forward pass
fmha_kernel.h                                2022-07-03 17:48:12 -07:00  Implement cross attention
fmha_utils.h                                 2022-07-09 23:31:56 -07:00  Implement for bf16
fmha.h                                       2022-10-21 18:22:27 -07:00  Don't need to run configure for the forward pass
fp16_switch.h                                2022-10-04 21:31:39 -04:00  Fixed switch statement, thanks @yocabon
philox.cuh                                   2022-10-21 12:04:27 -07:00  Rework dropout to decouple forward and backward
static_switch.h                              2022-10-13 20:49:02 -07:00  Implement attention kernel that splits the batch into two