cutlass/examples/41_fused_multi_head_attention
dan_the_3rd 2e10404d26
xFormer updates to fMHA FW (#773)
* xFormer updates to fMHA FW

* Convert format to BMHK for '41_fused_multi_head_attention_fixed_seqlen'

* Add missing files

* Remove xFormers specific code

* Update fused_multihead_attention_fixed_seqlen.cu

* rebase and solve conflicts

* remove white space

---------

Co-authored-by: danthe3rd <danthe3rd>
Co-authored-by: Haicheng Wu <haichengw@nvidia.com>
2023-02-08 23:00:10 -05:00
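One bullet in this commit converts the fixed-seqlen example to the BMHK tensor layout. In the xFormers convention that this fMHA code follows, BMHK denotes the dimension order [Batch, sequence length M, num Heads, head dim K]. A minimal sketch of the index arithmetic for a contiguous row-major BMHK buffer (helper names are illustrative, not taken from the example code):

```python
def bmhk_offset(b, m, h, k, B, M, H, K):
    """Flat row-major offset of element (b, m, h, k) in a BMHK tensor.

    BMHK = [Batch, seq len M, num Heads, head dim K]. Note the head
    dimension is innermost-but-one: rows of one sequence position hold
    all heads side by side. (Illustrative sketch, not the example's code.)
    """
    return ((b * M + m) * H + h) * K + k

def bmhk_strides(B, M, H, K):
    """Per-dimension strides (in elements) of a contiguous BMHK tensor."""
    return (M * H * K, H * K, K, 1)
```

With B=2, M=4, H=3, K=8, `bmhk_strides` gives `(96, 24, 8, 1)`: advancing one sequence position skips all H*K=24 per-head elements, which is the layout property the BMHK conversion relies on.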
gemm/                                         New updates for 2.11 (#775)         2023-01-20 16:32:57 -05:00
iterators/                                    xFormer updates to fMHA FW (#773)   2023-02-08 23:00:10 -05:00
attention_scaling_coefs_updater.h             New updates for 2.11 (#775)         2023-01-20 16:32:57 -05:00
CMakeLists.txt                                New updates for 2.11 (#775)         2023-01-20 16:32:57 -05:00
debug_utils.h                                 xFormer updates to fMHA FW (#773)   2023-02-08 23:00:10 -05:00
default_fmha_grouped.h                        New updates for 2.11 (#775)         2023-01-20 16:32:57 -05:00
epilogue_pipelined.h                          New updates for 2.11 (#775)         2023-01-20 16:32:57 -05:00
epilogue_rescale_output.h                     New updates for 2.11 (#775)         2023-01-20 16:32:57 -05:00
epilogue_thread_apply_logsumexp.h             New updates for 2.11 (#775)         2023-01-20 16:32:57 -05:00
find_default_mma.h                            New updates for 2.11 (#775)         2023-01-20 16:32:57 -05:00
fmha_grouped_problem_visitor.h                New updates for 2.11 (#775)         2023-01-20 16:32:57 -05:00
fmha_grouped.h                                xFormer updates to fMHA FW (#773)   2023-02-08 23:00:10 -05:00
fused_multihead_attention_fixed_seqlen.cu     xFormer updates to fMHA FW (#773)   2023-02-08 23:00:10 -05:00
fused_multihead_attention_variable_seqlen.cu  xFormer updates to fMHA FW (#773)   2023-02-08 23:00:10 -05:00
gemm_kernel_utils.h                           xFormer updates to fMHA FW (#773)   2023-02-08 23:00:10 -05:00
kernel_forward.h                              xFormer updates to fMHA FW (#773)   2023-02-08 23:00:10 -05:00
mma_from_smem.h                               xFormer updates to fMHA FW (#773)   2023-02-08 23:00:10 -05:00