vllm/csrc/attention
Latest commit: 9090bf02e7 "Support FP8-E5M2 KV Cache (#2279)" by zhaoyang-star, 2024-01-28 16:43:54 -08:00
Co-authored-by: zhaoyang <zhao.yang16@zte.com.cn>
Co-authored-by: Zhuohan Li <zhuohan123@gmail.com>
File                    Last commit                                                Date
attention_dtypes.h      Support FP8-E5M2 KV Cache (#2279)                          2024-01-28 16:43:54 -08:00
attention_generic.cuh   Change the name to vLLM (#150)                             2023-06-17 03:07:40 -07:00
attention_kernels.cu    Support FP8-E5M2 KV Cache (#2279)                          2024-01-28 16:43:54 -08:00
attention_utils.cuh     Merge EmbeddedLLM/vllm-rocm into vLLM main (#1836)         2023-12-07 23:16:52 -08:00
dtype_bfloat16.cuh      Merge EmbeddedLLM/vllm-rocm into vLLM main (#1836)         2023-12-07 23:16:52 -08:00
dtype_float16.cuh       Merge EmbeddedLLM/vllm-rocm into vLLM main (#1836)         2023-12-07 23:16:52 -08:00
dtype_float32.cuh       [BugFix] Fix NaN errors in paged attention kernel (#936)   2023-09-04 09:20:06 +09:00
dtype_fp8_e5m2.cuh      Support FP8-E5M2 KV Cache (#2279)                          2024-01-28 16:43:54 -08:00
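
The dtype_fp8_e5m2.cuh header added in #2279 backs the FP8-E5M2 KV cache path: keys and values are stored as one byte each and expanded back to a wider type inside the attention kernel. As a rough illustration of that general technique (a minimal sketch, not the actual contents of dtype_fp8_e5m2.cuh), the CUDA program below round-trips a buffer through FP8 E5M2 using the conversion intrinsics from cuda_fp8.h (CUDA 11.8+); the kernel names and layout are hypothetical.

// Hedged sketch: round-trip values through FP8 E5M2 storage bytes.
// Assumes CUDA >= 11.8 for <cuda_fp8.h>; names are illustrative, not vLLM's API.
#include <cuda_fp8.h>
#include <cuda_fp16.h>
#include <cstdio>

// Quantize fp32 values to fp8 e5m2 storage bytes (as when writing the KV cache).
__global__ void quantize_to_fp8_e5m2(const float* src,
                                     __nv_fp8_storage_t* dst, int n) {
  int i = blockIdx.x * blockDim.x + threadIdx.x;
  if (i < n) {
    // __NV_SATFINITE clamps to the finite e5m2 range instead of producing inf.
    dst[i] = __nv_cvt_float_to_fp8(src[i], __NV_SATFINITE, __NV_E5M2);
  }
}

// Dequantize fp8 e5m2 bytes back to fp32 via half (as when reading the cache).
__global__ void dequantize_from_fp8_e5m2(const __nv_fp8_storage_t* src,
                                         float* dst, int n) {
  int i = blockIdx.x * blockDim.x + threadIdx.x;
  if (i < n) {
    __half_raw h = __nv_cvt_fp8_to_halfraw(src[i], __NV_E5M2);
    dst[i] = __half2float(__half(h));
  }
}

int main() {
  const int n = 8;
  float h_in[n] = {0.f, 1.f, -2.5f, 3.14159f, 100.f, -0.0625f, 1e-3f, 7.f};
  float h_out[n];
  float *d_in, *d_out;
  __nv_fp8_storage_t* d_q;
  cudaMalloc(&d_in, n * sizeof(float));
  cudaMalloc(&d_out, n * sizeof(float));
  cudaMalloc(&d_q, n * sizeof(__nv_fp8_storage_t));
  cudaMemcpy(d_in, h_in, n * sizeof(float), cudaMemcpyHostToDevice);
  quantize_to_fp8_e5m2<<<1, 32>>>(d_in, d_q, n);
  dequantize_from_fp8_e5m2<<<1, 32>>>(d_q, d_out, n);
  cudaMemcpy(h_out, d_out, n * sizeof(float), cudaMemcpyDeviceToHost);
  for (int i = 0; i < n; ++i)
    printf("%g -> %g\n", h_in[i], h_out[i]);  // shows e5m2 rounding error
  cudaFree(d_in); cudaFree(d_out); cudaFree(d_q);
  return 0;
}

E5M2 keeps FP16's 5 exponent bits but only 2 mantissa bits, so the round trip preserves dynamic range while halving KV-cache memory at the cost of coarser rounding, which the printed before/after pairs make visible.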