vllm/vllm/model_executor

Latest commit: 005ba458b5 — Set torch default dtype in a context manager (#971)
Author: Antoni Baum (Signed-off-by: Antoni Baum <antoni.baum@protonmail.com>)
Date: 2023-09-07 15:39:37 +09:00
layers/             [BugFix] Implement RoPE for GPT-J (#941)              2023-09-06 11:54:33 +09:00
models/             [BugFix] Implement RoPE for GPT-J (#941)              2023-09-06 11:54:33 +09:00
parallel_utils/     Add Falcon support (new) (#592)                       2023-08-02 14:04:39 -07:00
__init__.py         [Quality] Add code formatter and linter (#326)        2023-07-03 11:31:55 -07:00
input_metadata.py   Add support for BLOOM (#331)                          2023-07-03 13:12:35 -07:00
model_loader.py     Set torch default dtype in a context manager (#971)   2023-09-07 15:39:37 +09:00
utils.py            Change the name to vLLM (#150)                        2023-06-17 03:07:40 -07:00
weight_utils.py     Accelerate LLaMA model loading (#234)                 2023-08-30 01:00:13 -07:00