
Conversation


@rttz159 rttz159 commented Jan 31, 2026

Fix vllm_openai model initialization to allow identical model and parameters.model_name.

Other components automatically inject the model name at the top level even when config.yml specifies it only once under parameters; this change lets validation pass in that case while still catching genuine conflicts.

Description

Previously, initializing the vllm_openai model would fail if both a top-level
model field and parameters.model/parameters.model_name were present, even when
they were identical. This caused issues because other components automatically
inject the model at the top level, even if the original config.yml only specifies
it once under parameters.

This change updates the Model validator to:

  • Allow identical values in both top-level model and parameters.model/model_name.
  • Normalize the configuration by removing the duplicate from parameters.
  • Continue to raise an error if the two values differ, preventing real conflicts.

This ensures smooth initialization of vllm_openai models while maintaining a single
source of truth for the model name.
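The updated validator behavior can be sketched roughly as follows. This is a simplified, standalone approximation of the logic described above, not the actual code in nemoguardrails/rails/llm/config.py (which runs as a Pydantic validator); the function and exception names follow the PR discussion.

```python
class InvalidModelConfigurationError(ValueError):
    """Raised when `model` and `parameters.model/model_name` disagree."""


def set_and_validate_model(data: dict) -> dict:
    """Normalize a model config dict so the top-level `model` field is
    the single source of truth for the model name."""
    parameters = data.get("parameters") or {}
    model_field = data.get("model")
    model_from_params = parameters.get("model") or parameters.get("model_name")

    if model_field and model_from_params:
        # Both fields present: only a real mismatch is an error.
        if model_field.strip() != model_from_params.strip():
            raise InvalidModelConfigurationError(
                "Conflicting model names: `model` and "
                "`parameters.model/model_name` must match if both are provided."
            )
        # Identical values: drop the duplicates from parameters.
        parameters.pop("model", None)
        parameters.pop("model_name", None)
    elif model_from_params:
        # Only present under parameters: promote it to the top level.
        data["model"] = model_from_params
        parameters.pop("model", None)
        parameters.pop("model_name", None)

    return data
```

With identical values the duplicate is silently removed; with differing values the error is still raised, preserving the original conflict detection.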

Related Issue(s)

Checklist

  • I've read the CONTRIBUTING guidelines.
  • I've updated the documentation if applicable.
  • I've added tests if applicable.
  • @mentions of the person or team responsible for reviewing proposed changes.

Signed-off-by: rttz159 <[email protected]>

greptile-apps bot commented Jan 31, 2026

Greptile Overview

Greptile Summary

This PR fixes vllm_openai model initialization by allowing identical values in both top-level model and parameters.model/parameters.model_name fields. Previously, having both fields would raise an error even when values matched.

Key Changes:

  • Modified Model.set_and_validate_model() to compare values when both fields are present
  • If values match (after stripping whitespace), duplicates are removed from parameters
  • If values differ, the original error is raised with updated message
  • Normalizes configuration to single source of truth while allowing components to inject both fields

Impact:

  • Fixes initialization failures for vllm_openai models when other components automatically inject the model field at top level
  • Maintains validation to prevent actual conflicts when values differ
  • Backward compatible - only relaxes validation for identical values

Confidence Score: 4/5

  • Safe to merge with minor improvements suggested
  • The change is well-scoped and solves a real initialization issue. The logic is straightforward - comparing values and removing duplicates when they match. However, the PR lacks tests for the new behavior and could benefit from case-insensitive comparison.
  • No files require special attention - single focused change to validation logic

Sequence Diagram

sequenceDiagram
    participant User
    participant Config as RailsConfig
    participant Validator as Model.set_and_validate_model
    participant Model as Model Instance

    User->>Config: Initialize with config.yml
    Config->>Validator: Call validator with model data
    
    alt Both model and parameters.model present
        Validator->>Validator: Check if model == parameters.model (strip whitespace)
        alt Values match
            Validator->>Validator: Remove model_name from parameters
            Validator->>Validator: Remove model from parameters
            Validator->>Config: Return normalized data
        else Values differ
            Validator-->>Config: Raise InvalidModelConfigurationError
        end
    else Only parameters.model present
        Validator->>Validator: Copy parameters.model to top-level model
        Validator->>Validator: Remove duplicate from parameters
        Validator->>Config: Return normalized data
    else Only top-level model present
        Validator->>Config: Return data as-is
    end
    
    Config->>Model: Create Model instance with normalized data
    Model-->>User: Return initialized model


@greptile-apps greptile-apps bot left a comment


1 file reviewed, 1 comment


Comment on nemoguardrails/rails/llm/config.py, lines +143 to +146
if model_field.strip() != model_from_params.strip():
    raise InvalidModelConfigurationError(
        "Conflicting model names: `model` and `parameters.model/model_name` must match if both are provided."
    )


case sensitivity not handled - model and Model would be treated as conflicting values even though model names are often case-insensitive


rttz159 (Author) replied:
Thanks for the comment! In this case, most of the models hosted by vLLM are case-sensitive, so I intentionally preserve case. Conflicts with differing case shouldn't occur for these models, so no change is needed.
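For reference, the case-insensitive comparison the reviewer suggested would look roughly like the hypothetical helper below. As noted above, it was intentionally not adopted, since model names served by vLLM are case-sensitive and the merged code keeps a case-sensitive check.

```python
def names_conflict(model_field: str, model_from_params: str) -> bool:
    """Reviewer-suggested variant: ignore surrounding whitespace and
    letter case when comparing the two model names. NOT what the PR does."""
    return model_field.strip().lower() != model_from_params.strip().lower()
```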
