Issues: vllm-project/vllm

Pinned issues

[Roadmap] vLLM Roadmap Q4 2024
#9006 opened Oct 1, 2024 by simon-mo (Open, 26 comments)

vLLM's V1 Engine Architecture
#8779 opened Sep 24, 2024 by simon-mo (Open, 10 comments)
Issues list

[Installation]: XPU dependencies not built against most recent oneAPI installation (label: installation)
#11734 opened Jan 4, 2025 by janimo
[Usage]: serving 'LLaVA-Next-Video-7B-Qwen2' (label: usage)
#11731 opened Jan 4, 2025 by Noctis-SC
[Bug]: PixtralHF inference broken since #11396 (label: bug)
#11726 opened Jan 3, 2025 by mgoin
[New Model]: unsloth/Llama-3.3-70B-Instruct-bnb-4bit (label: new model)
#11725 opened Jan 3, 2025 by Hyfred
[Bug]: Mismatched multi-modal placeholder for LLaVA-1.6-Mistral-7B (label: bug)
#11704 opened Jan 3, 2025 by jianghuyihei
[Bug]: 0.6.6.post1 crash in marlin_utils.py (label: bug)
#11703 opened Jan 3, 2025 by Flynn-Zh
[Bug]: vLLM crashes when using dynamic LoRA loading (label: bug)
#11702 opened Jan 3, 2025 by haitwang-cloud
[Bug]: vLLM is erroneously sending some informational output to the error stream (label: bug)
#11686 opened Jan 2, 2025 by mrakgr
[Bug]: Error while importing vllm since v0.6.6 (label: bug)
#11683 opened Jan 2, 2025 by kkimmk
[Usage]: Trying to add the codeshell 7b model, but output is garbled characters (label: usage)
#11681 opened Jan 2, 2025 by G1017
[Performance]: V1 vs V0 with multi-step scheduling (label: performance)
#11649 opened Dec 31, 2024 by Desmond819