[Doc] Add API reference for offline inference (#4710)
parent ac1fbf7fd2
commit 4bfa7e7f75
@@ -67,6 +67,13 @@ Documentation
    getting_started/quickstart
    getting_started/examples/examples_index
 
+.. toctree::
+   :maxdepth: 1
+   :caption: Offline Inference
+
+   offline_inference/llm
+   offline_inference/sampling_params
+
 .. toctree::
    :maxdepth: 1
    :caption: Serving
@@ -101,7 +108,6 @@ Documentation
    :maxdepth: 2
    :caption: Developer Documentation
 
-   dev/sampling_params
    dev/engine/engine_index
    dev/kernel/paged_attention
    dev/dockerfile/dockerfile
docs/source/offline_inference/llm.rst (new file)
@@ -0,0 +1,6 @@
+LLM Class
+=========
+
+.. autoclass:: vllm.LLM
+    :members:
+    :show-inheritance:
@@ -1,5 +1,5 @@
-Sampling Params
-===============
+Sampling Parameters
+===================
 
 .. autoclass:: vllm.SamplingParams
     :members:
@@ -48,7 +48,7 @@ completion = client.chat.completions.create(
 ```
 
 ### Extra Parameters for Chat API
-The following [sampling parameters (click through to see documentation)](../dev/sampling_params.rst) are supported.
+The following [sampling parameters (click through to see documentation)](../offline_inference/sampling_params.rst) are supported.
 
 ```{literalinclude} ../../../vllm/entrypoints/openai/protocol.py
 :language: python
@@ -65,7 +65,7 @@ The following extra parameters are supported:
 ```
 
 ### Extra Parameters for Completions API
-The following [sampling parameters (click through to see documentation)](../dev/sampling_params.rst) are supported.
+The following [sampling parameters (click through to see documentation)](../offline_inference/sampling_params.rst) are supported.
 
 ```{literalinclude} ../../../vllm/entrypoints/openai/protocol.py
 :language: python