squall/vllm
Commit Graph: 2 Commits

Author       SHA1        Message                                         Date
Woosuk Kwon  c9d5b6d4a8  Replace FlashAttention with xformers (#70)      2023-05-05 02:01:08 -07:00
Woosuk Kwon  09e9245478  Add custom kernel for RMS normalization (#16)   2023-04-01 00:51:22 +08:00
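Commit c9d5b6d4a8 (#70) replaces FlashAttention with xformers. As a rough illustration of what calling xformers' memory-efficient attention looks like, here is a minimal sketch; the tensor shapes, sizes, device, and dtype are assumptions chosen for the example and say nothing about how this repository actually wires in the call.

```python
# Minimal usage sketch of xformers memory-efficient attention (illustrative only).
# Requires a CUDA-capable GPU and the xformers package; tensors use the
# [batch, seq_len, num_heads, head_dim] layout that memory_efficient_attention expects.
import torch
from xformers.ops import memory_efficient_attention

batch, seq_len, num_heads, head_dim = 2, 128, 8, 64  # assumed example sizes
q = torch.randn(batch, seq_len, num_heads, head_dim, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# Computes softmax(q @ k^T / sqrt(head_dim)) @ v without materializing the
# full attention matrix.
out = memory_efficient_attention(q, k, v)
print(out.shape)  # torch.Size([2, 128, 8, 64])
```

memory_efficient_attention dispatches to one of several attention backends at runtime, which is presumably what made it a convenient drop-in for the previous FlashAttention dependency.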
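Commit 09e9245478 (#16) adds a custom kernel for RMS normalization. For context, a plain-PyTorch reference of the RMSNorm computation such a kernel would accelerate might look like the sketch below; the class name, epsilon default, and module structure are illustrative assumptions, not the repository's implementation.

```python
# Reference RMSNorm in plain PyTorch (illustrative sketch, not the repo's kernel).
# RMSNorm scales each hidden vector by the reciprocal of its root mean square and
# applies a learned per-dimension gain; unlike LayerNorm, no mean is subtracted.
import torch
import torch.nn as nn


class RMSNorm(nn.Module):
    def __init__(self, hidden_size: int, eps: float = 1e-6) -> None:
        super().__init__()
        self.weight = nn.Parameter(torch.ones(hidden_size))  # learned gain
        self.eps = eps  # numerical-stability term (assumed default)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Mean of squares over the hidden dimension.
        variance = x.pow(2).mean(dim=-1, keepdim=True)
        # Normalize by the root mean square, then apply the gain.
        return x * torch.rsqrt(variance + self.eps) * self.weight
```

A fused CUDA kernel typically collapses these element-wise steps into a single launch over the hidden dimension, which is presumably the motivation for writing a custom kernel rather than composing the operations in Python.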