
[Bug]: Model Access Restrictions Not Enforced via URL Model Name #8780

Open
ltahilra opened this issue Feb 25, 2025 · 2 comments
Labels
bug · mlops · user request

Comments

@ltahilra
ltahilra commented Feb 25, 2025

Description

When using model access groups to restrict access to specific models, users who do not have access to a given model should not be able to make requests to it. However, it appears that model access restrictions are only enforced when the model name is explicitly included in the request payload, and not when it is specified via the URL.

Steps to Reproduce

  1. Set up a model access group restricting access to a specific model (e.g., o3-mini-2025-01-31).

  2. Attempt to call the model via a URL-based request (without specifying "model" in the payload):

    curl -sSL 'http://0.0.0.0:4000/openai/deployments/o3-mini-2025-01-31/chat/completions?api-version=2024-10-21' \
      --header "Authorization: Bearer ${litellm_key}" \
      --header 'Content-Type: application/json' \
      --data-raw '{
        "messages": [
          { "role": "user", "content": "What LLM are you?" }
        ]
      }' | jq '.'
    • Expected: The request should be blocked if the user does not have access to o3-mini-2025-01-31.
    • Actual: The request goes through, bypassing access restrictions.
  3. Attempt the same request but include the model name in the payload:

    curl -sSL 'http://0.0.0.0:4000/chat/completions?api-version=2024-10-21' \
      --header "Authorization: Bearer ${litellm_key}" \
      --header 'Content-Type: application/json' \
      --data-raw '{
        "model": "o3-mini-2025-01-31",
        "messages": [
          { "role": "user", "content": "What LLM are you?" }
        ]
      }' | jq '.'
    
    • Expected: The request should be blocked if the user does not have access.
    • Actual: The correct "restricted access" error appears:

      {
        "error": {
          "message": "API Key not allowed to access model. This token can only access models=['default-models']. Tried to access gpt-o3-mini",
          "type": "key_model_access_denied",
          "param": "model",
          "code": "401"
        }
      }

Issue

It seems that LiteLLM only checks model access restrictions when the model is explicitly provided in the request payload, but not when it is embedded in the URL. This allows unauthorized users to access restricted models by using the deployment-style URL format.

Expected Behavior

Model access restrictions should be enforced regardless of whether the model name is provided in the request URL or in the payload.

Relevant log output

Are you a ML Ops Team?

Yes

What LiteLLM version are you on ?

v1.60.5

Twitter / LinkedIn details

No response

@ltahilra ltahilra added the bug Something isn't working label Feb 25, 2025
@krrishdholakia
Contributor

Acknowledging this - this seems to happen on the azure openai route, correct? @ltahilra

@ltahilra
Author

@krrishdholakia Yes, this is occurring on the Azure OpenAI route. It seems like model access restrictions are only enforced when the model name is provided in the request payload but not when it's included in the URL. Let me know if you need any additional details!
