Bug fixing vllm_openai model initialization with duplicate model fields #1609
base: develop
Conversation
Fix vllm_openai model initialization to allow identical `model` and `parameters.model_name`. Other components automatically add both fields even if the config.yml specifies it once; this ensures validation passes while still catching conflicts. Signed-off-by: rttz159 <[email protected]>
Greptile Summary: This PR fixes vllm_openai model initialization by allowing identical values in the top-level `model` field and `parameters.model/model_name`.
1 file reviewed, 1 comment
    if model_field.strip() != model_from_params.strip():
        raise InvalidModelConfigurationError(
            "Conflicting model names: `model` and `parameters.model/model_name` must match if both are provided."
        )
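The check in the diff above can be sketched as a standalone helper. This is an illustrative sketch only, assuming the behavior described in this PR; the function name `resolve_model_name` and the exception class definition here are hypothetical stand-ins, not the actual nemoguardrails API.

```python
class InvalidModelConfigurationError(ValueError):
    """Raised when `model` and `parameters.model/model_name` conflict."""


def resolve_model_name(model_field: str, parameters: dict) -> str:
    """Return the effective model name, tolerating identical duplicates.

    The model may be given at the top level (`model_field`), under
    `parameters["model"]`, or under `parameters["model_name"]`. If both a
    top-level value and a parameters value are present, they must match
    exactly (case-sensitive, per the PR discussion); otherwise a conflict
    error is raised.
    """
    model_from_params = parameters.get("model") or parameters.get("model_name")
    if model_field and model_from_params:
        # Identical duplicates are allowed; only genuine conflicts fail.
        if model_field.strip() != model_from_params.strip():
            raise InvalidModelConfigurationError(
                "Conflicting model names: `model` and "
                "`parameters.model/model_name` must match if both are provided."
            )
    return (model_field or model_from_params or "").strip()
```

This keeps validation strict while no longer rejecting configs where another component has injected a top-level `model` equal to the one already declared under `parameters`.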
Case sensitivity is not handled: `model` and `Model` would be treated as conflicting values, even though model names are often case-insensitive.
Path: nemoguardrails/rails/llm/config.py
Lines: 143–146
Thanks for the comment! In this case, most of the models hosted by vLLM are case-sensitive, so I intentionally preserve case. Conflicts that differ only in case shouldn't occur for these models, so no change is needed.
Fix vllm_openai model initialization to allow identical `model` and `parameters.model_name`. Other components automatically add both fields even if the config.yml specifies it once; this ensures validation passes while still catching conflicts.
Description
Previously, initializing the `vllm_openai` model would fail if both a top-level `model` field and `parameters.model`/`parameters.model_name` were present, even when they were identical. This caused issues because other components automatically inject the model at the top level, even if the original config.yml only specifies it once under `parameters`.

This change updates the `Model` validator to allow identical values in `model` and `parameters.model/model_name`, while still raising an error on genuine conflicts and keeping the model name defined once under `parameters`.

This ensures smooth initialization of `vllm_openai` models while maintaining a single source of truth for the model name.
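For context, a config of the following shape illustrates the scenario the description refers to. This is a hypothetical sketch: the exact keys (`openai_api_base`, `model_name`) and the model name shown are illustrative assumptions, not taken from this PR.

```yaml
# Hypothetical config.yml sketch: the model is declared once, under
# `parameters`. Other components may later inject a matching top-level
# `model` field; with this change, the identical duplicate passes
# validation instead of failing initialization.
models:
  - type: main
    engine: vllm_openai
    parameters:
      openai_api_base: "http://localhost:8000/v1"
      model_name: "meta-llama/Llama-3-8B-Instruct"
```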
Related Issue(s)
Checklist