vllm/requirements.txt

ninja # For faster builds.
psutil
ray >= 2.5.1
sentencepiece # Required for LLaMA tokenizer.
numpy
torch >= 2.0.0
transformers >= 4.28.0 # Required for LLaMA.
xformers >= 0.0.19
fastapi
uvicorn
pydantic < 2 # Required for OpenAI server.
fschat # Required for OpenAI ChatCompletion Endpoint.