Description
When using model access groups to restrict access to specific models, users who do not have access to a given model should not be able to make requests to it. However, it appears that model access restrictions are only enforced when the model name is explicitly included in the request payload, and not when it is specified via the URL.
Steps to Reproduce
1. Set up a model access group restricting access to a specific model (e.g., o3-mini-2025-01-31).
2. Attempt to call the model via a URL-based (deployment-style) request, without specifying "model" in the payload.
   Expected: The request should be blocked if the user does not have access.
   Actual: The request is not blocked and goes through to o3-mini-2025-01-31.
3. Attempt the same request, but include the model name in the payload.
   Actual: The correct "restricted access" error appears (a reproduction sketch follows the error output):
{
  "error": {
    "message": "API Key not allowed to access model. This token can only access models=['default-models']. Tried to access gpt-o3-mini",
    "type": "key_model_access_denied",
    "param": "model",
    "code": "401"
  }
}
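For reference, a minimal reproduction sketch is below. It assumes a LiteLLM proxy at http://localhost:4000, an Azure-compatible deployment-style route of the form /openai/deployments/<model>/chat/completions, and a virtual key that is not allowed to access o3-mini-2025-01-31; the base URL, key value, and exact route prefix are placeholders, not values from the original report.

```python
# Reproduction sketch (assumed values: proxy URL, key, and route prefix).
import requests

BASE_URL = "http://localhost:4000"          # assumed proxy address
API_KEY = "sk-restricted-virtual-key"       # assumed key WITHOUT access to o3-mini-2025-01-31
HEADERS = {"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"}

# 1) Azure-style, URL-based request: the model name only appears in the URL path,
#    not in the JSON payload. Per the report, this is NOT blocked.
url_based = requests.post(
    f"{BASE_URL}/openai/deployments/o3-mini-2025-01-31/chat/completions",
    headers=HEADERS,
    json={"messages": [{"role": "user", "content": "hi"}]},
)
print("URL-based request:", url_based.status_code)

# 2) Same request, but with "model" in the payload. Per the report, this correctly
#    returns the key_model_access_denied error shown above.
payload_based = requests.post(
    f"{BASE_URL}/chat/completions",
    headers=HEADERS,
    json={
        "model": "o3-mini-2025-01-31",
        "messages": [{"role": "user", "content": "hi"}],
    },
)
print("Payload-based request:", payload_based.status_code, payload_based.text)
```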
Issue
It seems that LiteLLM only checks model access restrictions when the model is explicitly provided in the request payload, but not when it is embedded in the URL. This allows unauthorized users to access restricted models by using the deployment-style URL format.
Expected Behavior
Model access restrictions should be enforced regardless of whether the model name is provided in the request URL or in the payload.
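For illustration only, here is a minimal sketch of the kind of check this implies: resolve the requested model from the deployment-style URL path when the payload omits "model", then apply the same access-group check either way. The function names (resolve_requested_model, enforce_model_access) and the route pattern are hypothetical, not LiteLLM internals.

```python
# Hypothetical sketch, not LiteLLM's actual code. FastAPI's HTTPException is used
# only because the proxy is FastAPI-based; any 401-style error would do.
import re
from fastapi import HTTPException

DEPLOYMENT_PATH = re.compile(r"^/openai/deployments/(?P<model>[^/]+)/")

def resolve_requested_model(path: str, payload: dict) -> str | None:
    """Prefer the payload's "model", but fall back to the deployment-style URL path."""
    if payload.get("model"):
        return payload["model"]
    match = DEPLOYMENT_PATH.match(path)
    return match.group("model") if match else None

def enforce_model_access(path: str, payload: dict, key_allowed_models: set[str]) -> None:
    """Raise the same 401 regardless of where the model name was supplied."""
    model = resolve_requested_model(path, payload)
    if model is not None and model not in key_allowed_models:
        raise HTTPException(
            status_code=401,
            detail=f"API Key not allowed to access model. Tried to access {model}",
        )

# Example: a URL-based request with no "model" in the payload is still checked.
# enforce_model_access(
#     "/openai/deployments/o3-mini-2025-01-31/chat/completions",
#     {"messages": [{"role": "user", "content": "hi"}]},
#     key_allowed_models={"default-models"},
# )
```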
Relevant log output
Are you a ML Ops Team?
Yes
What LiteLLM version are you on ?
v1.60.5
Twitter / LinkedIn details
No response
Acknowledging this - this seems to happen on the azure openai route, correct? @ltahilra

@krrishdholakia Yes, this is occurring on the Azure OpenAI route. It seems like model access restrictions are only enforced when the model name is provided in the request payload but not when it's included in the URL. Let me know if you need any additional details!