vllm/vllm/entrypoints

Latest commit 51cd22ce56 by dancingpipi (2024-02-04 14:25:36 -08:00):
    set&get llm internal tokenizer instead of the TokenizerGroup (#2741)
    Co-authored-by: shujunhua1 <shujunhua1@jd.com>
openai/         fix python 3.8 syntax (#2716)                                           2024-02-01 14:00:58 -08:00
__init__.py     Change the name to vLLM (#150)                                          2023-06-17 03:07:40 -07:00
api_server.py   [Experimental] Prefix Caching Support (#1669)                           2024-01-17 16:32:10 -08:00
llm.py          set&get llm internal tokenizer instead of the TokenizerGroup (#2741)    2024-02-04 14:25:36 -08:00
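Per the latest commit (#2741), llm.py's LLM class exposes the underlying tokenizer directly rather than the TokenizerGroup wrapper. Below is a minimal usage sketch, assuming the get_tokenizer()/set_tokenizer() accessors named in the commit title and a placeholder model name chosen for illustration:

```python
from vllm import LLM

# Placeholder model used only for illustration.
llm = LLM(model="facebook/opt-125m")

# After #2741, get_tokenizer() is assumed to return the plain tokenizer
# (e.g. a Hugging Face PreTrainedTokenizer) instead of the TokenizerGroup,
# so it can be used directly for encoding and decoding.
tokenizer = llm.get_tokenizer()
token_ids = tokenizer.encode("Hello, vLLM!")
print(token_ids)

# set_tokenizer() is likewise assumed to accept a plain tokenizer instance,
# e.g. after customizing it, rather than a TokenizerGroup.
llm.set_tokenizer(tokenizer)
```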