vllm/tests/entrypoints
Latest commit: ac04a97a9f by tomeras91, 2024-11-04 22:53:24 +00:00
[Frontend] Add max_tokens prometheus metric (#9881) (Signed-off-by: Tomer Asida <tomera@ai21.com>)
Name                Last commit                                                    Date
llm                 [Bugfix][Frontend] Guard against bad token ids (#9634)        2024-10-29 14:13:20 -07:00
offline_mode        [Bugfix] Fix offline mode when using mistral_common (#9457)   2024-10-18 18:12:32 -07:00
openai              [Frontend] Add max_tokens prometheus metric (#9881)           2024-11-04 22:53:24 +00:00
__init__.py         [CI/Build] Move test_utils.py to tests/utils.py (#4425)       2024-05-13 23:50:09 +09:00
conftest.py         Support for guided decoding for offline LLM (#6878)           2024-08-04 03:12:09 +00:00
test_chat_utils.py  [Bugfix]: Make chat content text allow type content (#9358)   2024-10-24 05:05:49 +00:00