vllm/csrc/attention
Latest commit db09d4ad83 by Zhuohan Li (2023-09-07 15:53:14 -07:00):
[FIX] Fix Alibi implementation in PagedAttention kernel (#945)
* Fix test_attention
Co-authored-by: Woosuk Kwon <woosuk.kwon@berkeley.edu>
Co-authored-by: Oliver-ss <yuansongwx@outlook.com>
File                   Last commit                                                      Date
attention_dtypes.h     Improve setup script & Add a guard for bfloat16 kernels (#130)  2023-05-27 00:59:32 -07:00
attention_generic.cuh  Change the name to vLLM (#150)                                   2023-06-17 03:07:40 -07:00
attention_kernels.cu   [FIX] Fix Alibi implementation in PagedAttention kernel (#945)  2023-09-07 15:53:14 -07:00
attention_utils.cuh    Change the name to vLLM (#150)                                   2023-06-17 03:07:40 -07:00
dtype_bfloat16.cuh     [BugFix] Fix NaN errors in paged attention kernel (#936)        2023-09-04 09:20:06 +09:00
dtype_float16.cuh      [BugFix] Fix NaN errors in paged attention kernel (#936)        2023-09-04 09:20:06 +09:00
dtype_float32.cuh      [BugFix] Fix NaN errors in paged attention kernel (#936)        2023-09-04 09:20:06 +09:00
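
For context on the ALiBi fix above: ALiBi (Attention with Linear Biases) adds a per-head linear positional bias to each attention logit before the softmax, penalizing distant keys in proportion to their distance from the query. The following is a minimal host-side sketch of that bias math, following the formulas from the ALiBi paper rather than vLLM's actual kernel code; the helper names are hypothetical.

    #include <cmath>
    #include <cstdio>
    #include <vector>

    // Per-head ALiBi slopes: 2^(-8*(i+1)/n) for head i = 0..n-1, assuming the
    // head count n is a power of two (the scheme from the ALiBi paper; in
    // practice the slopes are computed once on the host and passed to the
    // attention kernel).
    std::vector<float> alibi_slopes(int num_heads) {
      std::vector<float> slopes(num_heads);
      for (int i = 0; i < num_heads; ++i) {
        slopes[i] = std::pow(2.0f, -8.0f * (i + 1) / num_heads);
      }
      return slopes;
    }

    // Bias added to the attention logit of (query_pos, key_pos):
    // slope * (key_pos - query_pos), which is <= 0 under causal masking,
    // so earlier (more distant) keys are penalized linearly.
    float alibi_bias(float slope, int query_pos, int key_pos) {
      return slope * static_cast<float>(key_pos - query_pos);
    }

    int main() {
      const auto slopes = alibi_slopes(8);
      // Example: head 0, query at position 7 attending to the key at
      // position 3. With slope 2^-1 = 0.5 this prints bias = -2.0.
      std::printf("bias = %f\n", alibi_bias(slopes[0], 7, 3));
      return 0;
    }

The bias depends only on relative position, which is what lets a paged kernel apply it per key block: each token's bias can be recomputed from its absolute position without any learned positional embedding.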