vllm/docs/source

Latest commit: 058344f89a "[Frontend]-config-cli-args (#7737)" by Kaunil Dhruv, 2024-08-30 08:21:02 -07:00
Co-authored-by: Cyrus Leung <cyrus.tl.leung@gmail.com>
Co-authored-by: Kaunil Dhruv <kaunil_dhruv@intuit.com>
| Name                     | Latest commit                                                                                      | Date                      |
|--------------------------|----------------------------------------------------------------------------------------------------|---------------------------|
| _static                  | [Docs] Add RunLLM chat widget (#6857)                                                              | 2024-07-27 09:24:46 -07:00 |
| _templates/sections      | [Doc] Guide for adding multi-modal plugins (#6205)                                                 | 2024-07-10 14:55:34 +08:00 |
| assets                   | [Doc] add visualization for multi-stage dockerfile (#4456)                                         | 2024-04-30 17:41:59 +00:00 |
| automatic_prefix_caching | [Doc] Add an automatic prefix caching section in vllm documentation (#5324)                        | 2024-06-11 10:24:59 -07:00 |
| community                | Add Skywork AI as Sponsor (#7314)                                                                  | 2024-08-08 13:59:57 -07:00 |
| dev                      | [Core][VLM] Stack multimodal tensors to represent multiple images within each prompt (#7902)       | 2024-08-28 01:53:56 +00:00 |
| getting_started          | [TPU] Upgrade PyTorch XLA nightly (#7967)                                                          | 2024-08-28 13:10:21 -07:00 |
| models                   | [MODEL] add Exaone model support (#7819)                                                           | 2024-08-29 23:34:20 -07:00 |
| performance_benchmark    | [Doc] fix 404 link (#7966)                                                                         | 2024-08-28 13:54:23 -07:00 |
| quantization             | [Doc] fix the autoAWQ example (#7937)                                                              | 2024-08-28 12:12:32 +00:00 |
| serving                  | [Frontend]-config-cli-args (#7737)                                                                 | 2024-08-30 08:21:02 -07:00 |
| conf.py                  | [Bugfix][Docs] Update list of mock imports (#7493)                                                 | 2024-08-13 20:37:30 -07:00 |
| generate_examples.py     | Add example scripts to documentation (#4225)                                                       | 2024-04-22 16:36:54 +00:00 |
| index.rst                | [misc] Add Torch profiler support (#7451)                                                          | 2024-08-21 15:39:26 -07:00 |