
Conversation


@its-shashankY commented Sep 28, 2025

Description

This PR fixes the OutputParserException that occurs when using Ollama models instead of Gemini models.

Problem

When switching from Gemini to Ollama, users encountered an OutputParserException because the two LLM providers return differently formatted responses: the output parser expected Gemini-specific formatting and could not handle Ollama's output.

Solution

  • Updated the output parser to handle multiple response formats
  • Added fallback parsing logic for Ollama responses
  • Improved error messages for debugging
  • Added validation for different LLM response structures
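As an illustrative sketch of the fallback-parsing idea described above (the function name and regexes here are hypothetical, not the actual patched code in `langchain/output_parsers/base.py`), a parser that tolerates both provider styles might look like:

```python
import json
import re

def parse_llm_json(text: str) -> dict:
    """Parse a JSON payload from an LLM response, tolerating provider quirks.

    Hypothetical helper: Gemini often returns bare JSON, while Ollama models
    may wrap it in markdown code fences or surround it with prose.
    """
    # First attempt: the whole response is valid JSON (Gemini-style).
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        pass

    # Fallback 1: JSON inside a markdown code fence (common with Ollama).
    fenced = re.search(r"```(?:json)?\s*(\{.*?\})\s*```", text, re.DOTALL)
    if fenced is not None:
        return json.loads(fenced.group(1))

    # Fallback 2: first-to-last brace span embedded in free-form prose.
    bare = re.search(r"\{.*\}", text, re.DOTALL)
    if bare is not None:
        return json.loads(bare.group(0))

    raise ValueError(f"No JSON object found in LLM response: {text[:200]!r}")
```

Trying the strictest format first and relaxing step by step keeps Gemini behavior unchanged while accepting looser Ollama output.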

Changes Made

  • Modified langchain/output_parsers/base.py to handle Ollama response format
  • Updated error handling in the parser chain
  • Added unit tests for Ollama compatibility
  • Updated documentation examples
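To make the "improved error messages" change concrete, here is a stdlib-only sketch; the actual patch presumably raises langchain's own parser exception, and the class name below is invented for illustration:

```python
class LLMParseError(ValueError):
    """Hypothetical exception illustrating the PR's improved diagnostics:
    it carries the raw model output so a failing parse is easy to debug."""

    def __init__(self, message: str, llm_output: str):
        # Keep the full raw output on the exception, but embed only a short
        # snippet in the message so logs stay readable.
        snippet = llm_output[:200]
        super().__init__(f"{message}; raw output begins: {snippet!r}")
        self.llm_output = llm_output
```

A caller can then log `err.llm_output` in full whenever the snippet in the message is not enough to diagnose the failure.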

Testing

  • Existing tests pass
  • Added new tests for Ollama compatibility
  • Manually tested with both Gemini and Ollama models
  • Verified backward compatibility

Fixes

Closes #33016

Checklist

  • My code follows the project's style guidelines
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective
  • New and existing unit tests pass locally with my changes


vercel bot commented Sep 28, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

| Project | Deployment | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| langchain | Ready | Preview | Comment | Sep 28, 2025 2:54pm |

@github-actions github-actions bot added the integration Related to a provider partner package integration label Sep 28, 2025

codspeed-hq bot commented Sep 28, 2025

CodSpeed Instrumentation Performance Report

Merging #33140 will not alter performance

Comparing its-shashankY:fix/33016-ollama-output-parser-exception (0bc6982) with master (9863023)¹

Summary

✅ 1 untouched
⏩ 20 skipped²

Footnotes

  1. No successful run was found on master (54ea620) during the generation of this report, so 9863023 was used instead as the comparison base. There might be some changes unrelated to this pull request in this report.

  2. 20 benchmarks were skipped, so the baseline results were used instead. If they were deleted from the codebase, click here and archive them to remove them from the performance reports.
