Fix langchain-google-genai integration by stripping 'models/' prefix #3703
Summary

Fixes #3702: the langchain-google-genai integration fails with an "LLM Provider NOT provided" error.

The issue occurs when using langchain-google-genai's `ChatGoogleGenerativeAI` with CrewAI. The langchain library stores the model name with a `models/` prefix (e.g., `models/gemini/gemini-pro`), which Google's API uses internally, but LiteLLM doesn't recognize this format and throws an error.

Changes:

- Updated `create_llm()` in `llm_utils.py` to strip the `models/` prefix when extracting model names from langchain-like objects.

The fix is backward compatible: model names without the prefix are unchanged.
Review & Testing Checklist for Human

- Install `langchain-google-genai` and test the actual failing case from issue #3702 (langchain-google-genai integration fails with "LLM Provider NOT provided" on macOS): create an agent with `ChatGoogleGenerativeAI(model="gemini/gemini-pro")` and verify it now works.
- Test with other langchain integrations (e.g., `langchain-openai`) to ensure no regressions.
- Confirm the prefix handling is correct (should it strip `Models/` too? The fix only strips lowercase `models/`, based on the issue).

Test Plan Recommendation
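One way to exercise the behavior described in the checklist without installing the full integration (a hypothetical standalone check that mirrors the fix, not the repository's test suite):

```python
def strip_models_prefix(model_name: str) -> str:
    # Mirrors the fix: strip only the exact lowercase "models/" prefix.
    prefix = "models/"
    return model_name[len(prefix):] if model_name.startswith(prefix) else model_name

# Failing case from issue #3702: the prefix is stripped.
assert strip_models_prefix("models/gemini/gemini-pro") == "gemini/gemini-pro"
# Backward compatibility: unprefixed names are untouched.
assert strip_models_prefix("gemini/gemini-pro") == "gemini/gemini-pro"
# Case sensitivity: "Models/" is deliberately NOT stripped (see checklist).
assert strip_models_prefix("Models/gemini/gemini-pro") == "Models/gemini/gemini-pro"
```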
Notes

- I was unable to install the `langchain-google-genai` package due to environment constraints. The fix is based on analyzing the error message and creating mock objects that simulate the expected behavior.
- The fix only strips the lowercase `models/` prefix (case-sensitive); this seemed like the safe default but should be validated.

Link to Devin run: https://app.devin.ai/sessions/f38edeb68fa7484faaa5c6f9d5329afe
Requested by: João ([email protected])