feat(ollama): warn on empty `load` responses #32161
Conversation
Co-authored-by: mdrxy <[email protected]>
CodSpeed WallTime Performance Report: merging #32161 will not alter performance.
CodSpeed Instrumentation Performance Report: merging #32161 will not alter performance.
@copilot instead of skipping on empty `load` responses, emit warnings
Done! I've modified the implementation to emit warnings when empty `load` responses are received.
Problem
When using `ChatOllama` with `create_react_agent`, agents would sometimes terminate prematurely with empty responses when Ollama returned `done_reason: 'load'` responses with no content. This caused agents to return empty `AIMessage` objects instead of actual generated text.
Root Cause
The `_iterate_over_stream` and `_aiterate_over_stream` methods treated any response with `done: True` as final, regardless of `done_reason`. When Ollama returns `done_reason: 'load'` with empty content, it indicates the model was loaded but no actual generation occurred; this should not be considered a complete response.
Solution
Modified the streaming logic to skip responses when:
- `done: True`
- `done_reason: 'load'`
- the content is empty
This ensures agents only receive actual generated content while preserving backward compatibility for `load` responses that do contain content.
Changes
- `_iterate_over_stream`: skip empty `load` responses instead of yielding them
- `_aiterate_over_stream`: apply the same fix to async streaming
Testing
All scenarios now work correctly:
- `create_react_agent` integration fixed
Fixes #31482.
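A self-contained way to exercise that behavior is to run a simulated chunk stream through the filter and capture the warning. The chunk dicts and `iterate_over_stream` below are illustrative stand-ins, not the library's actual internals:

```python
import warnings

# Simulated stream: an empty 'load' chunk followed by real generation output.
chunks = [
    {"done": True, "done_reason": "load", "message": {"content": ""}},
    {"done": False, "message": {"content": "Hello"}},
    {"done": True, "done_reason": "stop", "message": {"content": ""}},
]


def iterate_over_stream(stream):
    """Yield chunks, warning on (and dropping) empty 'load' responses."""
    for resp in stream:
        if (
            resp.get("done")
            and resp.get("done_reason") == "load"
            and not resp.get("message", {}).get("content")
        ):
            warnings.warn("Skipping empty 'load' response from Ollama.")
            continue
        yield resp


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    kept = list(iterate_over_stream(chunks))

assert len(kept) == 2   # the empty 'load' chunk was dropped
assert len(caught) == 1  # and exactly one warning was raised
```

Under `warnings.catch_warnings(record=True)`, each emitted warning is appended to `caught`, so the count verifies that exactly one empty `load` response triggered the new warning path.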