squall / vllm
vllm / entrypoints at commit e23a43aef8
Latest commit 1d7c940d74 by Thomas Parnell: Add option to completion API to truncate prompt tokens (#3144), 2024-04-05 10:15:42 -07:00

openai         Add option to completion API to truncate prompt tokens (#3144)   2024-04-05 10:15:42 -07:00
__init__.py    Change the name to vLLM (#150)                                   2023-06-17 03:07:40 -07:00
api_server.py  Usage Stats Collection (#2852)                                   2024-03-28 22:16:12 -07:00
llm.py         Usage Stats Collection (#2852)                                   2024-03-28 22:16:12 -07:00