squall/vllm
vllm/.buildkite/lm-eval-harness (at commit 3bb4b1e4cd)
Latest commit: 6512937de1 by HandH1998 — Support W4A8 quantization for vllm (#5218), 2024-07-31 07:55:21 -06:00

Name                               Last commit                                             Date
configs                            Support W4A8 quantization for vllm (#5218)              2024-07-31 07:55:21 -06:00
run-lm-eval-gsm-hf-baseline.sh     [ CI/Build ] LM Eval Harness Based CI Testing (#5838)   2024-06-29 13:04:30 -04:00
run-lm-eval-gsm-vllm-baseline.sh   [ Misc ] fbgemm checkpoints (#6559)                     2024-07-20 09:36:57 -07:00
run-tests.sh                       [ CI/Build ] LM Eval Harness Based CI Testing (#5838)   2024-06-29 13:04:30 -04:00
test_lm_eval_correctness.py        [ Misc ] Support Fp8 via llm-compressor (#6110)         2024-07-07 20:42:11 +00:00