vllm/requirements.txt

ninja # For faster builds.
psutil # For CPU memory profiling.
ray # Required for distributed inference.
sentencepiece # Required for LLaMA tokenizer.
numpy
torch >= 2.0.0
transformers >= 4.28.0 # Required for LLaMA.
xformers >= 0.0.19
fastapi # Required for the API server.
uvicorn # Required for the API server.
pydantic # Required for the OpenAI server.
fschat # Required for the OpenAI ChatCompletion endpoint.
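
# A minimal install sketch, assuming this file sits at vllm/requirements.txt in a
# checkout of the repo and a CUDA-enabled PyTorch wheel is available for your platform:
#   pip install -r requirements.txt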