vllm/tests/entrypoints (last updated 2024-11-21 16:24:32 +00:00)
Name                Last commit                                                                                        Date
llm/                [V1] AsyncLLM Implementation (#9826)                                                               2024-11-11 23:05:38 +00:00
offline_mode/       [Bugfix] Fix offline mode when using mistral_common (#9457)                                        2024-10-18 18:12:32 -07:00
openai/             [Bug]: When apply continue_final_message for OpenAI server, the "echo":false is ignored (#10180)   2024-11-21 16:24:32 +00:00
__init__.py         [CI/Build] Move test_utils.py to tests/utils.py (#4425)                                            2024-05-13 23:50:09 +09:00
conftest.py         Support for guided decoding for offline LLM (#6878)                                                2024-08-04 03:12:09 +00:00
test_chat_utils.py  [Frontend] Automatic detection of chat content format from AST (#9919)                             2024-11-16 13:35:40 +08:00