GitHub Actions fail when using Azure OpenAI's gpt-4 #1015
Hi @no-yan, I am not sure how to address this. I thought the error might be something specific to LiteLLM with Azure OpenAI.
@mrT23 Thank you for your response!

> Working with LiteLLM v1.40.17

Yes, I've confirmed that the CLI works with LiteLLM v1.40.17. Specifically, I installed it as follows:

```shell
$ pip install git+https://github.com/Codium-ai/pr-agent@main
$ pip list | grep -e litellm -e pr-agent
litellm    1.40.17
pr-agent   0.2.2
```

At the time of the initial report, I was using v1.31.10, which is the LiteLLM version used by PR Agent v0.2.2 (the latest release).

> Fine with OpenAI API

Yes. I tested it using the OpenAI API in the same verification repository, and it was successful. Here is my setting:

```yaml
- id: pragent
  uses: Codium-ai/pr-agent@main
  env:
    OPENAI_KEY: ${{ secrets.OPENAI_KEY_NOT_AZURE }}
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
    CONFIG.MODEL: "gpt-4-0613"
```

> CLI works with other commands
Yes, they worked without any issues.
Added requirements.txt to the issue description.
The error seems to be related
@hariprasadiit, thanks for the feedback! @no-yan, try this and share if it helps. Note also that you can specify the model via a configuration file, and maybe that way LiteLLM will be able to "digest" the data.
@hariprasadiit @mrT23 I can say it helped in my case :)
@hariprasadiit @mrT23 Thank you for your feedback.

First, I changed the method of passing the API version to the PR Agent via an environment variable.

**Changes Made**

Failed method: ❌

```diff
+ os.environ['OPENAI.API_VERSION'] = '2023-05-15'
- api_version = '2023-05-15'
- get_settings().set("openai.api_version", api_version)
```

However, changing it to the following method succeeded: 🟢

```diff
- os.environ['OPENAI.API_VERSION'] = '2023-05-15'
+ os.environ['OPENAI_API_VERSION'] = '2023-05-15'
```

Similarly, I confirmed that the following configuration works in GitHub Actions:

```diff
  uses: Codium-ai/pr-agent@main
  env:
    OPENAI_KEY: ${{ secrets.OPENAI_KEY }}
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
    CONFIG.MODEL: "gpt-4-0613"
    OPENAI.API_TYPE: "azure"
-   OPENAI.API_VERSION: "2023-05-15"
+   OPENAI_API_VERSION: "2023-05-15"
    OPENAI.API_BASE: ${{ vars.API_ENDPOINT }}
    OPENAI.DEPLOYMENT_ID: ${{ vars.DEPLOYMENT_ID }}
+   AZURE_API_VERSION: "2023-05-15"
```

**Note**

I suspect there are two potential causes for this issue. The first cause seems to have been brought about by #989, which was merged one day before we encountered these errors.
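The naming fix above can be illustrated with a short, self-contained sketch (the dotted key and the date value are from the thread; the lookup behavior shown is illustrative): clients resolve configuration from the environment using underscore-style names, so a dotted environment key gets set but is never the key anyone reads.

```python
import os

# Dotted name: legal as an environment key, but OpenAI/LiteLLM-style clients
# never look this key up, so setting it has no effect on them.
os.environ["OPENAI.API_VERSION"] = "2023-05-15"

# Underscore name: the conventional key that clients actually read.
os.environ["OPENAI_API_VERSION"] = "2023-05-15"

# A client-style lookup finds only the underscore-named variable.
print(os.getenv("OPENAI_API_VERSION"))  # -> 2023-05-15
```

Both assignments succeed at the process level; the difference is purely which key the consuming library asks for.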
@no-yan are you able to get it working? Update: I got it working. I was using the wrong values for the API version.
@hariprasadiit are you able to share your config? I'm still getting a 404 with the following config (using gpt-4o):

```yaml
on:
  pull_request:
    types: [opened, reopened, ready_for_review]
  issue_comment:
jobs:
  pr_agent_job:
    if: ${{ github.event.sender.type != 'Bot' }}
    runs-on: ubuntu-latest
    permissions:
      issues: write
      pull-requests: write
      contents: write
    name: Run PR Agent on every pull request, respond to user comments
    steps:
      - name: PR Agent action step
        id: pragent
        uses: Codium-ai/pr-agent@main
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          GITHUB_ACTION_CONFIG_AUTO_REVIEW: 'true'
          GITHUB_ACTION_CONFIG_AUTO_DESCRIBE: 'true'
          GITHUB_ACTION_CONFIG_AUTO_IMPROVE: 'true'
          OPENAI_KEY: ${{ secrets.OPENAI_KEY }}
          OPENAI_API_TYPE: 'azure'
          OPENAI_API_BASE: '********'
          OPENAI_DEPLOYMENT_ID: '********'
          OPENAI_API_VERSION: '2024-06-01'
          AZURE_API_VERSION: '2024-06-01'
```
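For context on the 404: when the API type is `azure`, requests go to a deployment-scoped URL on the resource endpoint rather than to `api.openai.com`, so a wrong base, deployment name, or api-version yields a 404. A minimal sketch of how such a URL is assembled from the variables above (the base and deployment values here are placeholders, not from the thread):

```python
import os

# Placeholder defaults standing in for the masked values in the config above.
base = os.environ.get("OPENAI_API_BASE", "https://my-resource.openai.azure.com")
deployment = os.environ.get("OPENAI_DEPLOYMENT_ID", "gpt-4o")
api_version = os.environ.get("OPENAI_API_VERSION", "2024-06-01")

# Azure OpenAI routes chat completions through the deployment, with the
# api-version passed as a query parameter.
url = f"{base}/openai/deployments/{deployment}/chat/completions?api-version={api_version}"
print(url)
```

All three components must match the Azure resource exactly; any mismatch surfaces as a 404 rather than an auth error.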
@mark-hingston I'm using API version
@hariprasadiit When I pass in the env with just the
It seems to be trying to use the OpenAI API instead of the Azure OpenAI instance endpoint. After setting the variables up I've tried using:
and:
Both seem to be trying to connect to OpenAI and not Azure OpenAI.
@mark-hingston I kind of included all possible env vars and formats, and it is working. I didn't test which combination does the work.
@hariprasadiit Awesome, thanks, that worked for me.
This is what is happening: the settings object is automatically parsing the value as a datetime. I haven't dug into why that is the case, but that appears to be what is happening here. When the LiteLLMAIHandler sets the api_version, it is already a datetime, while litellm expects a string.
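A stdlib-only sketch of that failure mode (illustrative, not PR Agent's actual code): YAML/TOML-style loaders implicitly type an unquoted value like `2023-05-15` as a `datetime.date`, and calling the string method `.split()` on it reproduces the exact error from the stack trace below.

```python
import datetime

# What a config loader typically produces for an unquoted 2023-05-15.
api_version = datetime.date(2023, 5, 15)

# litellm assumes a string and calls .split(), which fails:
try:
    api_version.split("-")
except AttributeError as e:
    print(e)  # 'datetime.date' object has no attribute 'split'

# Coercing to a string (or quoting the value in the config) avoids it.
api_version = str(api_version)
assert api_version.split("-") == ["2023", "05", "15"]
```

Quoting the version in the settings file, or converting with `str()` before handing it to litellm, sidesteps the implicit date typing.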
Agreed, adding the env variables
Problem
GitHub Actions should run when a pull request is created, but they fail and leave the comment "Failed to generate code suggestions for PR".
This error first occurred on 6/22 and has occurred every time since.
No changes were made to the code or model before the error occurred.
Environments
Steps to reproduce:
1. Add `.github/workflows/action.yaml`
2. Set the `OPENAI_KEY` secret
Expected behavior
Run Review, Describe, and Suggest.
Actual behavior
GitHub Error messages are as follows:
Comment on GitHub
Failed to generate code suggestions for PR
Stack trace
```
LiteLLM:ERROR: main.py:399 - litellm.acompletion(): Exception occured - litellm.APIConnectionError: 'datetime.date' object has no attribute 'split'
```
Details
Additional information
This workflow was previously functioning without any issues, and there were no changes made to the code or model before the errors began occurring. The same error has occurred simultaneously across multiple repositories and multiple models.
The error persists even after downgrading the actions to the latest release (v0.22) or v0.2.
When I run the same configuration values in a Python script in a local environment, it operates without any problems. I am using the same API Key.
CLI Program and requirements.txt
I could not find a way to set `litellm.set_verbose=True` from GitHub Actions. I would appreciate it if the developers could provide additional information or measures on setting this log level, if possible.