Conversation
Walkthrough
This pull request primarily refactors the interaction with the OpenAI API.
Changes
Files selected (8)
- .github/workflows/openai-review.yml (3)
- package-lock.json (82)
- package.json (1)
- src/bot.ts (4)
- src/main.ts (1)
- src/options.ts (4)
- src/review-comment.ts (1)
- src/review.ts (5)
Files not summarized due to errors (1)
- package-lock.json (diff tokens exceeds limit)
Files skipped from review due to trivial changes (5)
- .github/workflows/openai-review.yml
- package.json
- src/options.ts
- src/review-comment.ts
- src/review.ts
Review comments generated (15)
- Review: 15
- LGTM: 0
Tips
Chat with CodeRabbit Bot (@coderabbitai)
- Reply on review comments left by this bot to ask follow-up questions. A review comment is a comment on a diff or a file.
- Invite the bot into a review comment chain by tagging @coderabbitai in a reply.
Code suggestions
- The bot may make code suggestions, but please review them carefully before committing since the line number ranges may be misaligned.
- You can edit the comment made by the bot and manually tweak the suggestion if it is slightly off.
Pausing incremental reviews
- Add `@coderabbitai: ignore` anywhere in the PR description to pause further reviews from the bot.
    getInput('openai_timeout_ms'),
    getInput('openai_concurrency_limit'),
    getInput('github_concurrency_limit'),
-   getInput('openai_base_url'),
    getInput('language')
  )
The removal of the `openai_base_url` input seems to be in line with the changes made to the `Bot` class and its interaction with the OpenAI API. However, please ensure that this change doesn't affect other parts of the codebase that might still be using this input. If there are such dependencies, they should be updated accordingly.
- getInput('openai_base_url'),
@@ -15,6 +15,7 @@
    "@octokit/action": "^6.0.4",
    "@octokit/plugin-retry": "^4.1.3",
    "@octokit/plugin-throttling": "^6.1.0",
+   "langchain": "^0.0.195",
The addition of the "langchain" package is noted. However, without a description or context about why this new dependency is needed, it's hard to assess its impact. Please provide more information about why this package is necessary and how it will be used in the project.
+ "langchain": "^0.0.195",
"node_modules/@anthropic-ai/sdk": {
  "version": "0.9.1",
  "resolved": "https://registry.npmjs.org/@anthropic-ai/sdk/-/sdk-0.9.1.tgz",
  "integrity": "sha512-wa1meQ2WSfoY8Uor3EdrJq0jTiZJoKoSii2ZVWRY1oN4Tlr5s59pADg9T79FTbPe1/se5c3pBeZgJL63wmuoBA==",
  "dependencies": {
    "@types/node": "^18.11.18",
    "@types/node-fetch": "^2.6.4",
    "abort-controller": "^3.0.0",
    "agentkeepalive": "^4.2.1",
    "digest-fetch": "^1.3.0",
    "form-data-encoder": "1.7.2",
    "formdata-node": "^4.3.2",
    "node-fetch": "^2.6.7",
    "web-streams-polyfill": "^3.2.1"
  }
},
"node_modules/@anthropic-ai/sdk/node_modules/@types/node": {
  "version": "18.19.3",
  "resolved": "https://registry.npmjs.org/@types/node/-/node-18.19.3.tgz",
  "integrity": "sha512-k5fggr14DwAytoA/t8rPrIz++lXK7/DqckthCmoZOKNsEbJkId4Z//BqgApXBUGrGddrigYa1oqheo/7YmW4rg==",
  "dependencies": {
    "undici-types": "~5.26.4"
  }
},
"node_modules/@anthropic-ai/sdk/node_modules/node-fetch": {
  "version": "2.7.0",
  "resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.7.0.tgz",
  "integrity": "sha512-c4FRfUm/dbcWZ7U+1Wq0AwCyFL+3nt2bEw05wfxSz+DWpWsitgmSgYmy2dQdWyKC1694ELPqMs/YzUSNozLt8A==",
  "dependencies": {
    "whatwg-url": "^5.0.0"
  },
  "engines": {
    "node": "4.x || >=6.0.0"
  },
  "peerDependencies": {
    "encoding": "^0.1.0"
  },
  "peerDependenciesMeta": {
    "encoding": {
      "optional": true
    }
  }
},
"node_modules/@anthropic-ai/sdk/node_modules/tr46": {
  "version": "0.0.3",
  "resolved": "https://registry.npmjs.org/tr46/-/tr46-0.0.3.tgz",
  "integrity": "sha512-N3WMsuqV66lT30CrXNbEjx4GEwlow3v6rr4mCcv6prnfwhS01rkgyFdjPNBYd9br7LpXV1+Emh01fHnq2Gdgrw=="
},
"node_modules/@anthropic-ai/sdk/node_modules/webidl-conversions": {
  "version": "3.0.1",
  "resolved": "https://registry.npmjs.org/webidl-conversions/-/webidl-conversions-3.0.1.tgz",
  "integrity": "sha512-2JAn3z8AR6rjK8Sm8orRC0h/bcl/DqL7tRPdGZ4I1CjdF+EaMLmYxBHyXuKL849eucPFhvBoxMsflfOb8kxaeQ=="
},
"node_modules/@anthropic-ai/sdk/node_modules/whatwg-url": {
  "version": "5.0.0",
  "resolved": "https://registry.npmjs.org/whatwg-url/-/whatwg-url-5.0.0.tgz",
  "integrity": "sha512-saE57nupxk6v3HY35+jzBwYa0rKSy0XR8JSxZPwgLr7ys0IBzhGviA1/TUGJLmSVqs8pb9AnvICXEuOHLprYTw==",
  "dependencies": {
    "tr46": "~0.0.3",
    "webidl-conversions": "^3.0.0"
  }
},
A new package "@anthropic-ai/sdk" has been added as a dependency. This seems to be a significant change, but there's no explanation provided in the PR description. It would be helpful to understand why this package was added and how it will be used in the project.
+ "node_modules/@anthropic-ai/sdk": {
+ "version": "0.9.1",
+ "resolved": "https://registry.npmjs.org/@anthropic-ai/sdk/-/sdk-0.9.1.tgz",
+ "integrity": "sha512-wa1meQ2WSfoY8Uor3EdrJq0jTiZJoKoSii2ZVWRY1oN4Tlr5s59pADg9T79FTbPe1/se5c3pBeZgJL63wmuoBA==",
+ "dependencies": {
+ "@types/node": "^18.11.18",
+ "@types/node-fetch": "^2.6.4",
+ "abort-controller": "^3.0.0",
+ "agentkeepalive": "^4.2.1",
+ "digest-fetch": "^1.3.0",
+ "form-data-encoder": "1.7.2",
+ "formdata-node": "^4.3.2",
+ "node-fetch": "^2.6.7",
+ "web-streams-polyfill": "^3.2.1"
+ }
+ },
@@ -2136,7 +2199,7 @@
    "version": "1.1.2",
    "resolved": "https://registry.npmjs.org/@tootallnate/once/-/once-1.1.2.tgz",
    "integrity": "sha512-RbzJvlNzmRq5c3O09UipeuXno4tA1FE6ikOjxZK0tuxVv3412l64l5t1W5pj4+rJq9vpkm/kwiR07aZXnsKPxw==",
-   "dev": true,
+   "devOptional": true,
  "integrity": "sha512-Dd0BYtWgnWJKwO1jkmTrzofjK2QXXcai0dmtzvIBhcA+RsG5h8R3xlyta0kGOZRNfL9GuRtb1knmPEhQrePCEw=="
},
"node_modules/@types/node-fetch": {
  "version": "2.6.9",
  "resolved": "https://registry.npmjs.org/@types/node-fetch/-/node-fetch-2.6.9.tgz",
  "integrity": "sha512-bQVlnMLFJ2d35DkPNjEPmd9ueO/rh5EiaZt2bhqiSarPjZIuIV6bPQVqcrEyvNo+AfTrRGVazle1tl597w3gfA==",
  "dependencies": {
    "@types/node": "*",
    "form-data": "^4.0.0"
  }
},
"node_modules/@types/node-fetch/node_modules/form-data": {
  "version": "4.0.0",
  "resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.0.tgz",
  "integrity": "sha512-ETEklSGi5t0QMZuiXoA/Q6vcnxcLQP5vdugSpuAyi6SVGi2clPPp+xgEhuMaHC+zGgn31Kd235W35f7Hykkaww==",
  "dependencies": {
    "asynckit": "^0.4.0",
    "combined-stream": "^1.0.8",
    "mime-types": "^2.1.12"
  },
  "engines": {
    "node": ">= 6"
  }
A new package "@types/node-fetch" has been added as a dependency. This seems to be a significant change, but there's no explanation provided in the PR description. It would be helpful to understand why this package was added and how it will be used in the project.
+ "node_modules/@types/node-fetch": {
+ "version": "2.6.9",
+ "resolved": "https://registry.npmjs.org/@types/node-fetch/-/node-fetch-2.6.9.tgz",
+ "integrity": "sha512-bQVlnMLFJ2d35DkPNjEPmd9ueO/rh5EiaZt2bhqiSarPjZIuIV6bPQVqcrEyvNo+AfTrRGVazle1tl597w3gfA==",
+ "dependencies": {
+ "@types/node": "*",
+ "form-data": "^4.0.0"
+ }
+ },
+ "node_modules/@types/node-fetch/node_modules/form-data": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.0.tgz",
+ "integrity": "sha512-ETEklSGi5t0QMZuiXoA/Q6vcnxcLQP5vdugSpuAyi6SVGi2clPPp+xgEhuMaHC+zGgn31Kd235W35f7Hykkaww==",
+ "dependencies": {
+ "asynckit": "^0.4.0",
+ "combined-stream": "^1.0.8",
+ "mime-types": "^2.1.12"
+ },
+ "engines": {
+ "node": ">= 6"
+ }
+ },
import {ConversationChain} from 'langchain/chains'
import {ChatOpenAI} from 'langchain/chat_models/openai'
import {BufferMemory} from 'langchain/memory'
import {ChatPromptTemplate, MessagesPlaceholder} from 'langchain/prompts'
import {ChainValues} from 'langchain/schema'
The import statements have been modified to use the `langchain` package instead of `chatgpt`. This change seems to be in line with the PR's goal of switching to Azure OpenAI. However, it's important to ensure that the new imports provide equivalent functionality to the old ones.
- import {
- ChatGPTAPI,
- ChatGPTError,
- ChatMessage,
- SendMessageOptions
- // eslint-disable-next-line import/no-unresolved
- } from 'chatgpt'
- import pRetry from 'p-retry'
+ import {ConversationChain} from 'langchain/chains'
+ import {ChatOpenAI} from 'langchain/chat_models/openai'
+ import {BufferMemory} from 'langchain/memory'
+ import {ChatPromptTemplate, MessagesPlaceholder} from 'langchain/prompts'
+ import {ChainValues} from 'langchain/schema'
  private readonly model: ChatOpenAI | null = null
  private readonly api: ConversationChain | null = null
The single `api` property has been replaced with separate `model` and `api` properties. It's crucial to verify that these new properties are used correctly throughout the class.
- private readonly api: ChatGPTAPI | null = null // not free
+ private readonly model: ChatOpenAI | null = null
+ private readonly api: ConversationChain | null = null
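To make the review point concrete, here is a minimal self-contained sketch (with hypothetical stand-in types, not the real langchain ones) of the new two-nullable-field shape and the null guard that every call site now needs:

```typescript
// Minimal sketch (hypothetical stand-in types, not the real langchain ones)
// of the new two-field shape: the raw chat model and the conversation chain
// wrapping it are stored separately, and both stay null until initialization
// succeeds, so every use site must guard for null.
class BotSketch {
  private readonly model: {invoke: (s: string) => string} | null = null
  private readonly api: {call: (s: string) => string} | null = null

  constructor(configured: boolean) {
    if (configured) {
      const model = {invoke: (s: string) => `model:${s}`}
      this.model = model
      // the chain delegates to the model it was constructed with
      this.api = {call: (s: string) => model.invoke(s)}
    }
  }

  // the null check the review asks to verify at every use of model/api
  ready(): boolean {
    return this.model != null && this.api != null
  }
}

console.log(new BotSketch(true).ready()) // true
console.log(new BotSketch(false).ready()) // false
```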
-   if (process.env.OPENAI_API_KEY) {
+   if (
+     process.env.AZURE_OPENAI_API_KEY &&
+     process.env.AZURE_OPENAI_API_VERSION &&
+     process.env.AZURE_OPENAI_API_INSTANCE_NAME &&
+     process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME
+   ) {
      const currentDate = new Date().toISOString().split('T')[0]
      const systemMessage = `${options.systemMessage}
      Knowledge cutoff: ${openaiOptions.tokenLimits.knowledgeCutOff}
      Current date: ${currentDate}

      IMPORTANT: Entire response must be in the language with ISO code: ${options.language}
      `
+     const chatPrompt = ChatPromptTemplate.fromMessages([
+       ['system', systemMessage],
+       new MessagesPlaceholder('history'),
+       ['human', '{input}']
+     ])

-     this.api = new ChatGPTAPI({
-       apiBaseUrl: options.apiBaseUrl,
-       systemMessage,
-       apiKey: process.env.OPENAI_API_KEY,
-       apiOrg: process.env.OPENAI_API_ORG ?? undefined,
-       debug: options.debug,
-       maxModelTokens: openaiOptions.tokenLimits.maxTokens,
-       maxResponseTokens: openaiOptions.tokenLimits.responseTokens,
-       completionParams: {
-         temperature: options.openaiModelTemperature,
-         model: openaiOptions.model
-       }
-     })
+     this.model = new ChatOpenAI({
+       temperature: options.openaiModelTemperature,
+       azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY,
+       azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION,
+       azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
+       azureOpenAIApiDeploymentName:
+         process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
+       timeout: this.options.openaiTimeoutMS,
+       maxRetries: this.options.openaiRetries
+     })
+     this.api = new ConversationChain({
+       memory: new BufferMemory({returnMessages: true, memoryKey: 'history'}),
+       prompt: chatPrompt,
+       llm: this.model
+     })
    } else {
      const err =
-       "Unable to initialize the OpenAI API, both 'OPENAI_API_KEY' environment variable are not available"
+       "Unable to initialize the OpenAI API, both 'AZURE_OPENAI_API_KEY' environment variable are not available"
      throw new Error(err)
    }
The constructor now initializes the `model` and `api` properties using environment variables related to Azure OpenAI. The error message shown when these environment variables are not available has also been updated accordingly. This is a significant change and should be thoroughly tested to ensure correct behavior.
- if (process.env.OPENAI_API_KEY) {
- const currentDate = new Date().toISOString().split('T')[0]
- const systemMessage = `${options.systemMessage}
- Knowledge cutoff: ${openaiOptions.tokenLimits.knowledgeCutOff}
- Current date: ${currentDate}
-
- IMPORTANT: Entire response must be in the language with ISO code: ${options.language}
- `
-
- this.api = new ChatGPTAPI({
- apiBaseUrl: options.apiBaseUrl,
- systemMessage,
- apiKey: process.env.OPENAI_API_KEY,
- apiOrg: process.env.OPENAI_API_ORG ?? undefined,
- debug: options.debug,
- maxModelTokens: openaiOptions.tokenLimits.maxTokens,
- maxResponseTokens: openaiOptions.tokenLimits.responseTokens,
- completionParams: {
- temperature: options.openaiModelTemperature,
- model: openaiOptions.model
- }
- })
- } else {
- const err =
- "Unable to initialize the OpenAI API, both 'OPENAI_API_KEY' environment variable are not available"
- throw new Error(err)
- }
+ if (
+ process.env.AZURE_OPENAI_API_KEY &&
+ process.env.AZURE_OPENAI_API_VERSION &&
+ process.env.AZURE_OPENAI_API_INSTANCE_NAME &&
+ process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME
+ ) {
+ const currentDate = new Date().toISOString().split('T')[0]
+ const systemMessage = `${options.systemMessage}
+ Knowledge cutoff: ${openaiOptions.tokenLimits.knowledgeCutOff}
+ Current date: ${currentDate}
+
+ IMPORTANT: Entire response must be in the language with ISO code: ${options.language}
+ `
+ const chatPrompt = ChatPromptTemplate.fromMessages([
+ ['system', systemMessage],
+ new MessagesPlaceholder('history'),
+ ['human', '{input}']
+ ])
+
+ this.model = new ChatOpenAI({
+ temperature: options.openaiModelTemperature,
+ azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY,
+ azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION,
+ azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
+ azureOpenAIApiDeploymentName:
+ process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
+ timeout: this.options.openaiTimeoutMS,
+ maxRetries: this.options.openaiRetries
+ })
+ this.api = new ConversationChain({
+ memory: new BufferMemory({returnMessages: true, memoryKey: 'history'}),
+ prompt: chatPrompt,
+ llm: this.model
+ })
+ } else {
+ const err =
+ "Unable to initialize the OpenAI API, both 'AZURE_OPENAI_API_KEY' environment variable are not available"
+ throw new Error(err)
+ }
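The guard in the diff above requires all four Azure variables at once. A small self-contained sketch (a hypothetical helper, not part of the PR) that reports which variables are missing, which would also allow a more precise error message than the one quoted in the diff:

```typescript
// Hypothetical helper (not part of the PR) sketching the constructor's
// guard: all four Azure OpenAI variables must be present at once.
const REQUIRED_AZURE_VARS = [
  'AZURE_OPENAI_API_KEY',
  'AZURE_OPENAI_API_VERSION',
  'AZURE_OPENAI_API_INSTANCE_NAME',
  'AZURE_OPENAI_API_DEPLOYMENT_NAME'
] as const

// Returns the names of any missing variables, so an error message could
// list all of them instead of mentioning only AZURE_OPENAI_API_KEY.
function missingAzureVars(env: Record<string, string | undefined>): string[] {
  return REQUIRED_AZURE_VARS.filter(name => !env[name])
}

// An env with only the key set is still incomplete.
console.log(missingAzureVars({AZURE_OPENAI_API_KEY: 'k'}).length) // 3
```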
- chat = async (message: string, ids: Ids): Promise<[string, Ids]> => {
-   let res: [string, Ids] = ['', {}]
+ chat = async (message: string): Promise<string> => {
+   let res: string = ''
    try {
-     res = await this.chat_(message, ids)
+     res = await this.chat_(message)
      return res
    } catch (e: unknown) {
-     if (e instanceof ChatGPTError) {
-       warning(`Failed to chat: ${e}, backtrace: ${e.stack}`)
-     }
      return res
    }
  }
The `chat` method has been simplified and no longer returns a `[string, Ids]` tuple. It's important to ensure that all calls to this method have been updated accordingly.
- chat = async (message: string, ids: Ids): Promise<[string, Ids]> => {
- let res: [string, Ids] = ['', {}]
- try {
- res = await this.chat_(message, ids)
- return res
- } catch (e: unknown) {
- if (e instanceof ChatGPTError) {
- warning(`Failed to chat: ${e}, backtrace: ${e.stack}`)
- }
- return res
- }
- }
+ chat = async (message: string): Promise<string> => {
+ let res: string = ''
+ try {
+ res = await this.chat_(message)
+ return res
+ } catch (e: unknown) {
+ return res
+ }
+ }
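The new calling convention can be sketched with a self-contained wrapper (hypothetical names, no langchain dependency): errors are swallowed and callers receive an empty string rather than a rejected promise, so call sites can simply check for `''`:

```typescript
// Hypothetical wrapper mirroring the new chat(): the error is swallowed and
// whatever was accumulated (here, the empty string) is returned, so call
// sites check for '' instead of destructuring a [string, Ids] tuple.
type ChatFn = (message: string) => Promise<string>

function safeChat(send: ChatFn): ChatFn {
  return async (message: string): Promise<string> => {
    let res = ''
    try {
      res = await send(message)
      return res
    } catch {
      return res
    }
  }
}

// A failing backend yields '' rather than a rejected promise.
const failing = safeChat(async () => {
  throw new Error('boom')
})
failing('hi').then(r => console.log(r === '')) // true
```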
- private readonly chat_ = async (
-   message: string,
-   ids: Ids
- ): Promise<[string, Ids]> => {
+ private readonly chat_ = async (message: string): Promise<string> => {
    // record timing
    const start = Date.now()
    if (!message) {
-     return ['', {}]
+     return ''
    }

-   let response: ChatMessage | undefined
+   let response: ChainValues | undefined

    if (this.api != null) {
-     const opts: SendMessageOptions = {
-       timeoutMs: this.options.openaiTimeoutMS
-     }
-     if (ids.parentMessageId) {
-       opts.parentMessageId = ids.parentMessageId
-     }
      try {
-       response = await pRetry(() => this.api!.sendMessage(message, opts), {
-         retries: this.options.openaiRetries
-       })
+       response = await this.api.call({input: message})
      } catch (e: unknown) {
-       if (e instanceof ChatGPTError) {
-         info(
-           `response: ${response}, failed to send message to openai: ${e}, backtrace: ${e.stack}`
-         )
-       }
+       info(`response: ${response}, failed to send message to openai: ${e}`)
The `chat_` method has been significantly modified. It no longer uses the `pRetry` function or the `SendMessageOptions` type from the old `chatgpt` package; instead, it directly calls the `api` property's `call` method. This change should be thoroughly tested to ensure correct behavior.
- private readonly chat_ = async (
- message: string,
- ids: Ids
- ): Promise<[string, Ids]> => {
- // record timing
- const start = Date.now()
- if (!message) {
- return ['', {}]
- }
-
- let response: ChatMessage | undefined
-
- if (this.api != null) {
- const opts: SendMessageOptions = {
- timeoutMs: this.options.openaiTimeoutMS
- }
- if (ids.parentMessageId) {
- opts.parentMessageId = ids.parentMessageId
- }
- try {
- response = await pRetry(() => this.api!.sendMessage(message, opts), {
- retries: this.options.openaiRetries
- })
- } catch (e: unknown) {
- if (e instanceof ChatGPTError) {
- info(
- `response: ${response}, failed to send message to openai: ${e}, backtrace: ${e.stack}`
- )
- }
- }
- const end = Date.now()
- info(`response: ${JSON.stringify(response)}`)
+ private readonly chat_ = async (message: string): Promise<string> => {
+ // record timing
+ const start = Date.now()
+ if (!message) {
+ return ''
+ }
+
+ let response: ChainValues | undefined
+
+ if (this.api != null) {
+ try {
+ response = await this.api.call({input: message})
+ } catch (e: unknown) {
+ info(`response: ${response}, failed to send message to openai: ${e}`)
+ }
+ const end = Date.now()
+ info(`response: ${JSON.stringify(response)}`)
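A `ChainValues` result is a loose string-keyed record; by default `ConversationChain` puts the reply text under a `response` key, though that key name should be treated as an assumption. A defensive extraction sketch (self-contained, with a local stand-in for the `ChainValues` type):

```typescript
// Local stand-in for langchain's ChainValues (a loose string-keyed record).
type ChainValues = Record<string, unknown>

// Pull the reply text out defensively; 'response' is assumed to be the
// ConversationChain default output key.
function extractReply(response: ChainValues | undefined): string {
  if (response == null) {
    return ''
  }
  const text = response['response']
  return typeof text === 'string' ? text : ''
}

console.log(extractReply({response: 'LGTM'})) // LGTM
console.log(extractReply(undefined) === '') // true
```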
Summary by CodeRabbit

- Updated the workflow to trigger on `pull_request` instead of `pull_request_target`, and added a condition to skip the workflow for `dependabot[bot]`.
- Added the `langchain` dependency in `package.json`.
- Refactored the `Bot` class in `src/bot.ts` by removing the `ChatGPTAPI` import, updating the OpenAI API configuration, and modifying the `chat` and `chat_` methods.
- Removed the `openai_base_url` input from function calls in `src/main.ts` and the `apiBaseUrl` property from the `Options` class in `src/options.ts`.
- Simplified `src/review-comment.ts` and `src/review.ts` by removing array destructuring and directly assigning responses from `await` calls to variables.