Error when closing AzureAIChatCompletionClient upon a validation error #5414
Comments
Your model family is incorrect. That value error then led to a second exception, because closing the client could not find an instance of the underlying Azure AI Inference client. Update your code to this:

    import asyncio
    import os

    from azure.core.credentials import AzureKeyCredential
    from autogen_ext.models.azure import AzureAIChatCompletionClient
    from autogen_core.models import UserMessage, ModelFamily


    async def main():
        client = AzureAIChatCompletionClient(
            model="gpt-4o",
            endpoint="https://models.inference.ai.azure.com",
            credential=AzureKeyCredential(os.environ["GITHUB_TOKEN"]),
            model_info={
                "json_output": True,
                "function_calling": True,
                "vision": True,
                # Use a ModelFamily constant, not a guessed string.
                "family": ModelFamily.GPT_4O,
            },
        )
        result = await client.create(
            [UserMessage(content="What is the capital of France?", source="user")]
        )
        print(result)


    if __name__ == "__main__":
        asyncio.run(main())
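Since the underlying bug is in how the client is closed, it may also help to close the client explicitly once you are done with it. This is a minimal sketch, assuming the async close() method exposed by the chat completion client in this AutoGen version, added at the end of main():

        # Release the underlying Azure AI Inference client when finished
        # (assumes the client exposes an async close() method).
        await client.close()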
Keep this issue open; we will need to update the handling of the closing error.
Ah okay, I was guessing at the family, since the example given used "unknown", a string. It might help if the example code used a ModelFamily value.
@pamelafox if you use the Python extension in VS Code, it's easier with auto-suggestions of valid family values.
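For reference, one quick way to see which family strings are valid is to inspect the constants defined on ModelFamily itself; a minimal sketch, noting that the exact set of members depends on the installed autogen_core version:

    from autogen_core.models import ModelFamily

    # Print the uppercase constants defined on ModelFamily
    # (the exact set varies by autogen_core version).
    for name in dir(ModelFamily):
        if name.isupper() and not name.startswith("_"):
            print(name, "=", getattr(ModelFamily, name))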
What happened?
I'm attempting to use AzureAIChatCompletionClient with the travel_planning example, with this client:
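(The exact snippet from the original report is not preserved here. The following is a hedged reconstruction based on the replies above, where the family was a guessed string rather than a ModelFamily constant; the "family" value shown is a hypothetical placeholder, not necessarily the one actually used.)

    import os

    from azure.core.credentials import AzureKeyCredential
    from autogen_ext.models.azure import AzureAIChatCompletionClient

    client = AzureAIChatCompletionClient(
        model="gpt-4o",
        endpoint="https://models.inference.ai.azure.com",
        credential=AzureKeyCredential(os.environ["GITHUB_TOKEN"]),
        model_info={
            "json_output": True,
            "function_calling": True,
            "vision": True,
            # Hypothetical placeholder: the original used a guessed family string,
            # which fails validation and then surfaces the error on client close.
            "family": "guessed-family",
        },
    )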
However, I get this error:
What did you expect to happen?
Success
How can we reproduce it (as minimally and precisely as possible)?
Open travel_planning.ipynb and use that client instead of the default one.
AutoGen version
latest (editable install from Codespaces)
Which package was this bug in
Extensions
Model used
gpt-4o, GitHub models
Python version
3.12
Operating system
Linux
Any additional info you think would be helpful for fixing this bug
No response