Add GitHub Copilot LLM Backend Integration #5693
Conversation
This looks great!
```go
}

// loadGitHubToken loads the saved GitHub access token
func loadGitHubToken() (string, error) {
```
Nit - do we have this kind of token cache in enough other places that it's worth its own little wrapper? It'd be nice to treat all of our token/cred caches the same if possible.
```go
}

// deviceCodeFlow performs the GitHub device code authentication flow
func deviceCodeFlow(ctx context.Context, console input.Console) (string, error) {
```
Nit - ditto on whether this Device Code flow could be packaged up for reuse elsewhere.
```go
ghCpModel, err := openai.New(
	openai.WithToken(tokenData.Token),
	openai.WithBaseURL(githubCopilotApi),
	openai.WithAPIType(openai.APITypeOpenAI), // GitHub Copilot uses the OpenAI API type
	openai.WithModel("gpt-4"),
	openai.WithHTTPClient(&httpClient{}),
```
Do we have the ability to pass a version here?
```go
callOptions := []llms.CallOption{}
ghCpModel.CallbacksHandler = modelContainer.logger
callOptions = append(callOptions, llms.WithTemperature(1.0))
```
Is this the temp we're using elsewhere? Feels really high. If nothing else we might want this to be configurable.
```go
newToken, err := newCopilotToken(githubToken)
if err != nil {
	// If Copilot token request fails, GitHub token might be expired
	if strings.Contains(err.Error(), "status 401") || strings.Contains(err.Error(), "status 403") {
```
Any chance a nested error could cause this logic to break if it contains that text (possibly based on a user asking GHCP about "status 401")? Is it worth matching your entire message, like `strings.HasPrefix(err.Error(), "copilot API error (status 401):")`?
Approving for EngSys
Overview
This PR adds GitHub Copilot as a new LLM backend for the Azure Developer CLI (azd) agent, enabling developers to use GitHub Copilot's language models for AI-assisted development workflows within azd.
Changes

Core Integration
- pkg/llm/github_copilot.go: Complete implementation of GitHub Copilot as an LLM backend

Authentication Flow
The implementation includes a complete OAuth device code flow authentication system.
Technical Implementation Details

OpenAI API Reuse
We reuse the OpenAI model structure from langchaingo because GitHub Copilot exposes an OpenAI-compatible chat completions API: only the base URL, token, and HTTP client differ.
Token Management Strategy
- ~/.azd/gh-cp/gh (JSON format)
- ~/.azd/gh-cp/cp (JSON format with expiration)

Integration Points
- cmd/container.go: Registers the new provider as "github-copilot"
- internal/agent/agent_factory.go: Updates context handling for authentication
- cmd/init.go: Enables GitHub Copilot for AI-assisted project initialization
- pkg/llm/model_factory.go: Adds context parameter for authentication flows
- pkg/llm/manager.go: Adds new LlmTypeGhCp constant and string representation

Security Considerations
Benefits for azd Users
Resolves
Closes #5679 - Copilot integration
Testing
Configuration Example
Users can configure GitHub Copilot as their default LLM backend:
```
azd config set ai.model.type github-copilot
```
The first time they use an AI-assisted command, they'll be prompted to authenticate with GitHub, after which tokens are automatically managed.