vllm/csrc/attention
Andre Slavescu c894836108
[Model] Add support for GPT-J (#226)
Co-authored-by: Woosuk Kwon <woosuk.kwon@berkeley.edu>
2023-07-08 17:55:16 -07:00
attention_dtypes.h Improve setup script & Add a guard for bfloat16 kernels (#130) 2023-05-27 00:59:32 -07:00
attention_generic.cuh Change the name to vLLM (#150) 2023-06-17 03:07:40 -07:00
attention_kernels.cu [Model] Add support for GPT-J (#226) 2023-07-08 17:55:16 -07:00
attention_utils.cuh Change the name to vLLM (#150) 2023-06-17 03:07:40 -07:00
dtype_bfloat16.cuh Change the name to vLLM (#150) 2023-06-17 03:07:40 -07:00
dtype_float16.cuh Change the name to vLLM (#150) 2023-06-17 03:07:40 -07:00
dtype_float32.cuh Change the name to vLLM (#150) 2023-06-17 03:07:40 -07:00