vllm / docs / source (at commit e491c7e053)

History

Latest commit: e491c7e053 by Frαnçois, "[Doc] update(example model): for OpenAI compatible serving" (#4503), 2024-05-01 10:14:16 -07:00
Name                 | Last commit                                                                                  | Date
assets               | [Doc] add visualization for multi-stage dockerfile (#4456)                                   | 2024-04-30 17:41:59 +00:00
dev                  | [Doc] add visualization for multi-stage dockerfile (#4456)                                   | 2024-04-30 17:41:59 +00:00
getting_started      | Unable to find Punica extension issue during source code installation (#4494)                | 2024-05-01 00:42:09 +00:00
models               | [Bugfix][Model] Refactor OLMo model to support new HF format in transformers 4.40.0 (#4324)  | 2024-04-25 09:35:56 -07:00
quantization         | Enable scaled FP8 (e4m3fn) KV cache on ROCm (AMD GPU) (#3290)                                | 2024-04-03 14:15:55 -07:00
serving              | [Doc] update(example model): for OpenAI compatible serving (#4503)                           | 2024-05-01 10:14:16 -07:00
conf.py              | [CI] Disable non-lazy string operation on logging (#4326)                                    | 2024-04-26 00:16:58 -07:00
generate_examples.py | Add example scripts to documentation (#4225)                                                 | 2024-04-22 16:36:54 +00:00
index.rst            | [Doc] add visualization for multi-stage dockerfile (#4456)                                   | 2024-04-30 17:41:59 +00:00