squall / vllm
vllm / vllm / entrypoints at 564985729a
Latest commit: Robert Shaw, 564985729a: [ BugFix ] Move zmq frontend to IPC instead of TCP (#7222), 2024-08-07 16:24:56 +00:00
openai          [ BugFix ] Move zmq frontend to IPC instead of TCP (#7222)                                          2024-08-07 16:24:56 +00:00
__init__.py     Change the name to vLLM (#150)                                                                      2023-06-17 03:07:40 -07:00
api_server.py   [BugFix] Overhaul async request cancellation (#7111)                                                2024-08-07 13:21:41 +08:00
chat_utils.py   [Frontend] Gracefully handle missing chat template and fix CI failure (#7238)                       2024-08-07 09:12:05 +00:00
launcher.py     [Frontend] Reapply "Factor out code for running uvicorn" (#7095)                                    2024-08-04 20:40:51 -07:00
llm.py          [Core] Subclass ModelRunner to support cross-attention & encoder sequences (towards eventual encoder/decoder model support) (#4942)  2024-08-06 16:51:47 -04:00
logger.py       [Frontend] Refactor prompt processing (#4028)                                                       2024-07-22 10:13:53 -07:00