What happened?
If I configure a model in Azure OpenAI like the following, it works fine:
```yaml
- model_name: 'openai.gpt-4o'
  litellm_params:
    model: 'azure/gpt-4o-2024-11-20' # 👈 Notice the - instead of . before version
    api_base: 'https://my-endpoin.openai.azure.com/'
    api_version: '2024-12-01-preview'
    api_key: 'my-secret'
```
However, if I name my model with a `.` in the model name, pricing and other model info are not automatically discovered by LiteLLM. The UI shows blank/null values for everything and is unable to track usage and cost accurately:
```yaml
- model_name: 'openai.gpt-4o'
  litellm_params:
    model: 'azure/gpt-4o.2024-11-20' # 👈 Notice the . instead of - before version
    api_base: 'https://my-endpoin.openai.azure.com/'
    api_version: '2024-12-01-preview'
    api_key: 'my-secret'
```
I even tried setting `base_model`, and that doesn't fix it when there is a `.` in the model name.
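For reference, this is roughly what I tried. The `base_model` hint goes under `model_info` in the proxy config (the endpoint and key below are placeholders from the examples above):

```yaml
- model_name: 'openai.gpt-4o'
  litellm_params:
    model: 'azure/gpt-4o.2024-11-20' # deployment name with a . before the version
    api_base: 'https://my-endpoin.openai.azure.com/'
    api_version: '2024-12-01-preview'
    api_key: 'my-secret'
  model_info:
    base_model: 'azure/gpt-4o-2024-11-20' # should map pricing to the canonical model
```

Even with `base_model` pointing at the canonical name, the UI still shows blank model info for the `.`-named deployment.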
EDIT:
I also noticed an issue with the o1 model in Azure. If I name my model deployment in Azure anything but `o1`, the model details and pricing are not recognized by the LiteLLM proxy config. For example, `model: azure/o1` works but `model: azure/o1-2024-12-17` does not.
Are you a ML Ops Team?
No
What LiteLLM version are you on?
1.57.3
1.59.1 (for azure/o1 support)
Twitter / LinkedIn details
No response