squall/vllm — vllm/entrypoints @ 03dd7d52bf

Latest commit: 7134303cbb by Roy — [Bugfix][Core] Fix get decoding config from ray (#4335), 2024-04-27 11:30:08 +00:00

openai/        [Bugfix][Core] Fix get decoding config from ray (#4335)            2024-04-27 11:30:08 +00:00
__init__.py    Change the name to vLLM (#150)                                     2023-06-17 03:07:40 -07:00
api_server.py  [Frontend] Add --log-level option to api server (#4377)            2024-04-26 05:36:01 +00:00
llm.py         Make initialization of tokenizer and detokenizer optional (#3748)  2024-04-21 22:06:46 +00:00