squall/vllm
vllm/vllm/entrypoints at 83c644fe7e
Latest commit 654bc5ca49: Support for guided decoding for offline LLM (#6878)
Yihuan Bu, Co-authored-by: Cyrus Leung <cyrus.tl.leung@gmail.com>
2024-08-04 03:12:09 +00:00
openai/          Support for guided decoding for offline LLM (#6878)              2024-08-04 03:12:09 +00:00
__init__.py      Change the name to vLLM (#150)                                   2023-06-17 03:07:40 -07:00
api_server.py    Revert "[Frontend] Factor out code for running uvicorn" (#7012)  2024-07-31 16:34:26 -07:00
chat_utils.py    [Frontend] Factor out chat message parsing (#7055)               2024-08-02 21:31:27 -07:00
llm.py           Support for guided decoding for offline LLM (#6878)              2024-08-04 03:12:09 +00:00
logger.py        [Frontend] Refactor prompt processing (#4028)                    2024-07-22 10:13:53 -07:00