vllm/tests/entrypoints

Latest commit 6ace6fba2c by Robert Shaw (2024-11-11 23:05:38 +00:00):
[V1] AsyncLLM Implementation (#9826)

Signed-off-by: Nick Hill <nickhill@us.ibm.com>
Signed-off-by: rshaw@neuralmagic.com <rshaw@neuralmagic.com>
Signed-off-by: Nick Hill <nhill@redhat.com>
Co-authored-by: Nick Hill <nickhill@us.ibm.com>
Co-authored-by: Varun Sundar Rabindranath <varun@neuralmagic.com>
Co-authored-by: Nick Hill <nhill@redhat.com>
Co-authored-by: Tyler Michael Smith <tyler@neuralmagic.com>
Name                Last commit message                                           Last commit date
llm/                [V1] AsyncLLM Implementation (#9826)                          2024-11-11 23:05:38 +00:00
offline_mode/       [Bugfix] Fix offline mode when using mistral_common (#9457)   2024-10-18 18:12:32 -07:00
openai/             [V1] AsyncLLM Implementation (#9826)                          2024-11-11 23:05:38 +00:00
__init__.py         [CI/Build] Move test_utils.py to tests/utils.py (#4425)       2024-05-13 23:50:09 +09:00
conftest.py         Support for guided decoding for offline LLM (#6878)           2024-08-04 03:12:09 +00:00
test_chat_utils.py  [Bugfix]: Make chat content text allow type content (#9358)   2024-10-24 05:05:49 +00:00