vllm/vllm/lora
Latest commit 8af890a865 by Jee Li: Enable more models to inference based on LoRA (#3382)
Co-authored-by: Antoni Baum <antoni.baum@protonmail.com>
2024-03-25 18:09:31 -07:00
File                Last commit                                             Date
__init__.py         [Experimental] Add multi-LoRA support (#1804)           2024-01-23 15:26:37 -08:00
layers.py           Enable more models to inference based on LoRA (#3382)   2024-03-25 18:09:31 -07:00
lora.py             [CI] Try introducing isort. (#3495)                     2024-03-25 07:59:47 -07:00
models.py           Enable more models to inference based on LoRA (#3382)   2024-03-25 18:09:31 -07:00
punica.py           chore(vllm): codespell for spell checking (#2820)       2024-02-21 18:56:01 -08:00
request.py          [Experimental] Add multi-LoRA support (#1804)           2024-01-23 15:26:37 -08:00
utils.py            [Experimental] Add multi-LoRA support (#1804)           2024-01-23 15:26:37 -08:00
worker_manager.py   [CI] Try introducing isort. (#3495)                     2024-03-25 07:59:47 -07:00