squall / vllm
vllm / tests / models at commit 6d792d2f31
Latest commit 6d792d2f31 by Isotr0py: [Bugfix][VLM] Fix Fuyu batching inference with max_num_seqs>1 (#8892), 2024-09-27 01:15:58 -07:00
decoder_only              [Bugfix][VLM] Fix Fuyu batching inference with max_num_seqs>1 (#8892)   2024-09-27 01:15:58 -07:00
embedding                 [CI/Build] Reorganize models tests (#7820)                              2024-09-13 10:20:06 -07:00
encoder_decoder           [Model] Add support for the multi-modal Llama 3.2 model (#8811)         2024-09-25 13:29:32 -07:00
fixtures                  [CI/Build] Update pixtral tests to use JSON (#8436)                     2024-09-13 03:47:52 +00:00
__init__.py               [CI/Build] Move test_utils.py to tests/utils.py (#4425)                 2024-05-13 23:50:09 +09:00
test_oot_registration.py  [misc][ci] fix cpu test with plugins (#7489)                            2024-08-13 19:27:46 -07:00
test_registry.py          [BugFix] Fix test breakages from transformers 4.45 upgrade (#8829)      2024-09-26 16:46:43 -07:00
utils.py                  [Core][Frontend] Support Passing Multimodal Processor Kwargs (#8657)    2024-09-23 07:44:48 +00:00