What happened?
Summary:
In LiteLLM v1.77.3, when making image generation requests to the gpt-image-1 model on Azure OpenAI (using managed identity authentication) with parameters such as response_format, the library incorrectly injects an unsupported extra_body field into the request payload. This results in a 400 Bad Request error from the Azure endpoint.
Issue Details:
When a request to the gpt-image-1 model via LiteLLM includes unsupported parameters (e.g., response_format), LiteLLM currently places them in the extra_body field of the payload. Because the model does not accept these parameters, the endpoint rejects the request with a 400 Bad Request error, as sketched below.
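A minimal repro sketch of the failing call; the `azure/gpt-image-1` model alias, placeholder endpoint, and keyword arguments below are assumptions for illustration, not copied from a working setup:

```python
import litellm

# Sketch only: credentials are assumed to come from the environment /
# managed identity; the endpoint below is a placeholder.
response = litellm.image_generation(
    model="azure/gpt-image-1",                                      # assumed Azure deployment alias
    prompt="A small beautiful sunset over a mountain landscape",
    response_format="b64_json",                                     # not supported by gpt-image-1; ends up in extra_body
    api_base="https://xxxxxx.openai.azure.com",
    api_version="2025-03-01-preview",
)
```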
Suggested Fix:
When making requests to a specific model, LiteLLM should automatically filter out unsupported parameters rather than passing them through in the extra_body field. Injecting unsupported fields leads to request failures (e.g., 400 Bad Request), especially against strict endpoints such as Azure OpenAI. A rough sketch of the suggested filtering follows.
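As a rough illustration of the suggested behavior (not LiteLLM's actual internals), unsupported parameters could be dropped before the payload is assembled; the helper name and the supported-parameter set below are hypothetical:

```python
# Hypothetical filtering step, for illustration only; the supported set is an assumption.
GPT_IMAGE_1_SUPPORTED = {"model", "prompt", "n", "size", "quality", "user"}

def build_image_payload(params: dict) -> dict:
    # Keep only parameters the target model accepts instead of routing the
    # remainder into extra_body, which Azure OpenAI rejects with 400 Bad Request.
    dropped = sorted(k for k in params if k not in GPT_IMAGE_1_SUPPORTED)
    if dropped:
        print(f"Dropping unsupported params for gpt-image-1: {dropped}")
    return {k: v for k, v in params.items() if k in GPT_IMAGE_1_SUPPORTED}

payload = build_image_payload({
    "model": "gpt-image-1",
    "prompt": "A small beautiful sunset over a mountain landscape",
    "response_format": "b64_json",   # filtered out rather than sent via extra_body
})
```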
Relevant log output
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://xxxxxx.openai.azure.com/openai/deployments/gpt-image-1/images/generations?api-version=2025-03-01-preview'
POST Request Sent from LiteLLM:
curl -X POST \
https://xxxxx.openai.azure.com/openai/deployments/gpt-image-1/images/generations?api-version=2025-03-01-preview \
-H 'Content-Type: application/json' -H 'Authorization: Be****Pw' \
-d '{'model': 'gpt-image-1', 'prompt': 'A small beautiful sunset over a mountain landscape', 'extra_body': {'response_format': 'b64_json'}}'
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
v1.77.3