diff --git a/README.md b/README.md
index 13c654a2..8ea4d029 100644
--- a/README.md
+++ b/README.md
@@ -27,7 +27,7 @@ Easy, fast, and cheap LLM serving for everyone
 - [2023/06] We officially released vLLM! FastChat-vLLM integration has powered [LMSYS Vicuna and Chatbot Arena](https://chat.lmsys.org) since mid-April. Check out our [blog post](https://vllm.ai).
 
 ---
-
+## About
 vLLM is a fast and easy-to-use library for LLM inference and serving.
 
 vLLM is fast with: