
How do I use a local Ollama model as my model here? #21

Open
Travis-Barton opened this issue Feb 22, 2025 · 1 comment

Comments

@Travis-Barton

This is great and looks super powerful, but the cost of so many tokens at scale is going to be huge for heavy use. One good way to cut costs is to use local models plus local search (I'll save search for another time). How can I use my local Ollama models with this?

@rlancemartin
Collaborator

Hi! I have a simpler version below that uses Ollama. The main differences are that 1) it skips the planning phase and 2) it doesn't perform section writing in parallel.
https://github.com/langchain-ai/ollama-deep-researcher
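For anyone wanting to wire a local Ollama model in directly, here is a minimal sketch that talks to Ollama's default local chat endpoint (`http://localhost:11434/api/chat`). This is not the repo's own code, just an illustration; it assumes Ollama is installed and a model (e.g. `llama3.2`) has been pulled.

```python
import json
from urllib.request import Request, urlopen

# Default address of a locally running Ollama server.
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # return one complete response instead of a token stream
    }

def ask_ollama(model: str, prompt: str) -> str:
    """Send a chat request to the local Ollama server and return the reply text."""
    body = json.dumps(build_chat_payload(model, prompt)).encode()
    req = Request(OLLAMA_URL, data=body,
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

if __name__ == "__main__":
    # Requires `ollama serve` running and the model pulled locally.
    print(ask_ollama("llama3.2", "Summarize why local models cut costs."))
```

If you're already using LangChain, the `langchain-ollama` package's `ChatOllama` class wraps this same local server, so it can usually be dropped in wherever a chat model object is expected.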
