Problem Statement
There is currently no file in the examples directory showcasing how to use Ollama to power Legion agents.
Proposed Solution
We need a simple example file with one or two tools and some basic message sending, similar to examples/agents/basic_agent.py.
Alternative Solutions
n/a
Additional Context
This ticket depends on the Ollama provider being fully tested and working in all four modes of operation.
Implementation Ideas
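As a starting point, the example could look roughly like the sketch below. The file name (examples/agents/ollama_basic_agent.py), the helper names (run_agent, stub_chat), and the message/tool-call shapes are all hypothetical placeholders, not Legion's actual API; a stubbed chat function stands in for a real Ollama call so the sketch runs without a local server.

```python
"""Hypothetical sketch of examples/agents/ollama_basic_agent.py.

Legion's real agent API will differ; this only shows the rough shape of
an Ollama-backed agent with one tool and a basic message exchange.
"""

def get_weather(city: str) -> str:
    """Example tool: return a canned weather report for a city."""
    return f"It is sunny in {city}."

TOOLS = {"get_weather": get_weather}

def stub_chat(messages, tools):
    """Stand-in for a real Ollama chat call (e.g. against a local
    llama model); here it always requests the weather tool."""
    return {"tool_calls": [{"name": "get_weather",
                            "arguments": {"city": "Paris"}}]}

def run_agent(user_message: str, chat=stub_chat) -> str:
    """Send one user message, dispatch any tool calls the model
    requested, and return the combined result."""
    messages = [{"role": "user", "content": user_message}]
    reply = chat(messages, tools=list(TOOLS))
    results = []
    for call in reply.get("tool_calls", []):
        fn = TOOLS[call["name"]]
        results.append(fn(**call["arguments"]))
    return " ".join(results) if results else reply.get("content", "")

print(run_agent("What's the weather in Paris?"))  # → It is sunny in Paris.
```

In the real example, stub_chat would be replaced by a call through Legion's Ollama provider, and the tool schema would be whatever format Legion expects.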
Benefits
Helps developers who are interested in using Legion with locally running models see how to do so with agents.
Potential Challenges
When using small local models like llama3.3 8b, tool use (and function calling more broadly) is far less accurate, so errors may surface that are not the fault of the code but rather a limitation of the model and the hardware running it.
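One way the example could soften this: validate model-proposed tool arguments before dispatching them, since small local models often hallucinate or omit parameters. The sketch below is a stdlib-only illustration of that idea; safe_tool_call is a hypothetical helper, not part of Legion.

```python
import inspect

def safe_tool_call(fn, arguments: dict):
    """Check model-proposed arguments against the tool's signature
    before calling it, so a hallucinated parameter produces a clean
    error instead of a crash."""
    sig = inspect.signature(fn)
    try:
        sig.bind(**arguments)  # raises TypeError on missing/unknown params
    except TypeError as exc:
        return {"ok": False, "error": f"bad arguments: {exc}"}
    return {"ok": True, "result": fn(**arguments)}

def add(a: int, b: int) -> int:
    return a + b

print(safe_tool_call(add, {"a": 2, "b": 3}))      # valid call
print(safe_tool_call(add, {"a": 2, "bogus": 3}))  # hallucinated parameter
```

A rejected call could be fed back to the model as a tool error message so it can retry with corrected arguments.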