vllm / vllm / lora at commit e90fc21f2e

Latest commit: e90fc21f2e by Zhuohan Li, [Hardware][Neuron] Refactor neuron support (#3471), 2024-03-22 01:22:17 +00:00
__init__.py         [Experimental] Add multi-LoRA support (#1804)           2024-01-23 15:26:37 -08:00
layers.py           [Hardware][Neuron] Refactor neuron support (#3471)      2024-03-22 01:22:17 +00:00
lora.py             [Hardware][Neuron] Refactor neuron support (#3471)      2024-03-22 01:22:17 +00:00
models.py           [Hardware][Neuron] Refactor neuron support (#3471)      2024-03-22 01:22:17 +00:00
punica.py           chore(vllm): codespell for spell checking (#2820)       2024-02-21 18:56:01 -08:00
request.py          [Experimental] Add multi-LoRA support (#1804)           2024-01-23 15:26:37 -08:00
utils.py            [Experimental] Add multi-LoRA support (#1804)           2024-01-23 15:26:37 -08:00
worker_manager.py   Re-enable the 80 char line width limit (#3305)          2024-03-10 19:49:14 -07:00
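For context, these modules back the multi-LoRA support introduced in #1804; request.py defines LoRARequest, which callers pass per prompt to select an adapter at inference time. A minimal usage sketch is below; the model name and adapter path are placeholders, and it assumes a vLLM build where the LLM entrypoint accepts enable_lora.

from vllm import LLM, SamplingParams
from vllm.lora.request import LoRARequest  # defined in vllm/lora/request.py

# Load a base model with LoRA support enabled (placeholder model name).
llm = LLM(model="meta-llama/Llama-2-7b-hf", enable_lora=True)

sampling_params = SamplingParams(temperature=0.0, max_tokens=64)

# Attach an adapter per request: (adapter name, integer id, local adapter path).
outputs = llm.generate(
    ["Write a SQL query that lists all users."],
    sampling_params,
    lora_request=LoRARequest("sql_adapter", 1, "/path/to/sql_lora_adapter"),
)
print(outputs[0].outputs[0].text)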