squall / vllm
vllm / vllm / engine (at commit a979d9771e)
Latest commit: dfeb2ecc3a by Nick Hill, "[Misc] Include matched stop string/token in responses (#2976)", 2024-03-25 17:31:32 -07:00
Co-authored-by: Sahil Suneja <sahilsuneja@gmail.com>
File                  Last commit                                                      Date
__init__.py           Change the name to vLLM (#150)                                   2023-06-17 03:07:40 -07:00
arg_utils.py          [Feature] Add vision language model support. (#3042)             2024-03-25 14:16:30 -07:00
async_llm_engine.py   [Feature] Add vision language model support. (#3042)             2024-03-25 14:16:30 -07:00
llm_engine.py         [Misc] Include matched stop string/token in responses (#2976)    2024-03-25 17:31:32 -07:00
metrics.py            [CI] Try introducing isort. (#3495)                              2024-03-25 07:59:47 -07:00
ray_utils.py          [CI] Try introducing isort. (#3495)                              2024-03-25 07:59:47 -07:00
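
For context, the files above make up vLLM's engine package: arg_utils.py defines the EngineArgs used to configure an engine, llm_engine.py defines the synchronous LLMEngine, and async_llm_engine.py wraps it for async serving. Below is a minimal sketch of driving the engine directly, assuming the LLMEngine/EngineArgs API as of this commit; the model name, request id, and prompt are illustrative placeholders, not taken from the listing.

# Minimal sketch (assumed API as of this commit; model, request id, and
# prompt are illustrative placeholders).
from vllm import SamplingParams
from vllm.engine.arg_utils import EngineArgs        # arg_utils.py
from vllm.engine.llm_engine import LLMEngine        # llm_engine.py

# Build a synchronous engine from argument defaults plus a model name.
engine = LLMEngine.from_engine_args(EngineArgs(model="facebook/opt-125m"))

# Queue one request, then step the engine until all requests finish.
engine.add_request("request-0", "Hello, my name is",
                   SamplingParams(max_tokens=16))
while engine.has_unfinished_requests():
    for request_output in engine.step():
        if request_output.finished:
            print(request_output.outputs[0].text)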