
Add azure support <Feature request> #29

Open
amitrahav opened this issue Jul 10, 2023 · 5 comments

Comments

@amitrahav

Please add Azure OpenAI support that accepts a different OpenAI endpoint.

@Ran-Mewo

Just add a way to change the OpenAI base URL from the CLI, so we can use the Azure endpoint or any proxy endpoint.

@vicdotdevelop

+1

@krrishdholakia

I believe this should now be covered with the LiteLLM integration @amitrahav @Ran-Mewo @vicdotdevelop

response = completion(model="azure/<your_deployment_name>", messages=messages)

https://docs.litellm.ai/docs/providers/azure
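For reference, a minimal sketch of the env-var-based Azure route the LiteLLM docs describe. The credentials and deployment name below are placeholders, not values from this thread:

```python
import os

# Placeholder Azure credentials; LiteLLM reads these env vars when the
# model name carries the "azure/" prefix (per the docs linked above).
os.environ.setdefault("AZURE_API_KEY", "my-azure-api-key")
os.environ.setdefault("AZURE_API_BASE", "https://my-resource.openai.azure.com")
os.environ.setdefault("AZURE_API_VERSION", "2023-07-01-preview")

def ask_azure(prompt):
    # Import here so the sketch loads even without litellm installed.
    from litellm import completion
    # "azure/<deployment>" routes the request to Azure OpenAI;
    # "my-deployment" is a placeholder deployment name.
    return completion(
        model="azure/my-deployment",
        messages=[{"role": "user", "content": prompt}],
    )
```

Note that with this route the OpenAI key is not involved at all; only the three `AZURE_*` variables are read.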

Happy to make a PR if there's anything you feel is missing

cc: @joshpxyne

@geekyme-fsmk

@krrishdholakia so if I'm using Azure, do I still have to set the env var OPENAI_API_KEY?

Or would simply setting AZURE_API_KEY, AZURE_API_BASE, AZURE_API_VERSION work?

@geekyme-fsmk

Without OPENAI_API_KEY, the command complains that the env var needs to be set.

When i set it to a dummy value, I get:

litellm.exceptions.APIConnectionError: Error code: 401 - {'error': {'message': 'Missing Authentication header', 'code': 401}}

I don't think Azure is supported with the current codebase sadly.
