[Doc] Add API reference for offline inference (#4710)

Cyrus Leung 2024-05-14 08:47:42 +08:00 committed by GitHub
parent ac1fbf7fd2
commit 4bfa7e7f75
4 changed files with 17 additions and 5 deletions


@@ -67,6 +67,13 @@ Documentation
    getting_started/quickstart
    getting_started/examples/examples_index
 
+.. toctree::
+   :maxdepth: 1
+   :caption: Offline Inference
+
+   offline_inference/llm
+   offline_inference/sampling_params
+
 .. toctree::
    :maxdepth: 1
    :caption: Serving
@@ -101,7 +108,6 @@ Documentation
    :maxdepth: 2
    :caption: Developer Documentation
 
-   dev/sampling_params
    dev/engine/engine_index
    dev/kernel/paged_attention
    dev/dockerfile/dockerfile


@@ -0,0 +1,6 @@
+LLM Class
+=========
+
+.. autoclass:: vllm.LLM
+    :members:
+    :show-inheritance:


@@ -1,5 +1,5 @@
-Sampling Params
-===============
+Sampling Parameters
+===================
 
 .. autoclass:: vllm.SamplingParams
     :members:


@@ -48,7 +48,7 @@ completion = client.chat.completions.create(
 ```
 
 ### Extra Parameters for Chat API
-The following [sampling parameters (click through to see documentation)](../dev/sampling_params.rst) are supported.
+The following [sampling parameters (click through to see documentation)](../offline_inference/sampling_params.rst) are supported.
 
 ```{literalinclude} ../../../vllm/entrypoints/openai/protocol.py
 :language: python
@@ -65,7 +65,7 @@ The following extra parameters are supported:
 ```
 
 ### Extra Parameters for Completions API
-The following [sampling parameters (click through to see documentation)](../dev/sampling_params.rst) are supported.
+The following [sampling parameters (click through to see documentation)](../offline_inference/sampling_params.rst) are supported.
 
 ```{literalinclude} ../../../vllm/entrypoints/openai/protocol.py
 :language: python