
Using other AI model Ollama #184

Open
AlazaziAmr opened this issue Oct 28, 2024 · 10 comments

Comments

@AlazaziAmr

AlazaziAmr commented Oct 28, 2024

While using Ollama (with Llama), the outputs/responses are not as good and accurate as OpenAI's. Is there a special code implementation needed in the ai_assistants.py file to use Ollama (with Llama)?

@fjsj
Member

fjsj commented Oct 28, 2024

Hi @AlazaziAmr, I believe that's the model's fault, not anything specific caused by django-ai-assistant.
But you can double-check what input is being passed to the model by using LangSmith: https://docs.smith.langchain.com/
All you have to do is set these env vars:

LANGCHAIN_TRACING_V2=true
LANGCHAIN_API_KEY=<your-api-key>

If you're using the example projects, set those in your .env file.
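The same two variables can also be set from Python before the assistant runs, which is handy for quick local debugging (a minimal sketch; these are the standard LangSmith env var names, and the key value is a placeholder):

```python
import os

# Enable LangSmith tracing for all LangChain calls in this process.
# Must be set before any chain/assistant is created.
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "<your-api-key>"  # placeholder, use your real key

print(os.environ["LANGCHAIN_TRACING_V2"])  # → true
```

Once tracing is on, every request/response pair (including which tools were attached) shows up in the LangSmith UI.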

@AlazaziAmr
Author

Hi @fjsj, everything is set correctly, but Ollama (with Llama) can't find the specific function to do the work the way OpenAI does.
Maybe it's as you said, "the model's fault".

@fjsj
Member

fjsj commented Oct 28, 2024

Is the specific Ollama model you're using really fine-tuned to perform function calling? @AlazaziAmr

@AlazaziAmr
Author

As mentioned on their website, it supports tool calling. @fjsj
[example screenshot]
The same example shown in the picture above, performed with gpt-4o, works perfectly.

@fjsj
Member

fjsj commented Oct 28, 2024

Did you double-check, with LangSmith or with logs, the request made to the model? Are the function tools being passed?

@AlazaziAmr
Author

AlazaziAmr commented Oct 28, 2024

No, the function tools are not passed.

@fjsj
Member

fjsj commented Oct 28, 2024

What ChatModel are you using? It should be one that supports "Tool calling" in this table: https://python.langchain.com/docs/integrations/chat/#featured-providers

Is it really ChatOllama?
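One quick way to verify whether tools ever reach the model is to inspect the raw request body in the LangSmith trace. Below is a minimal sketch of the OpenAI-style `tools` payload shape that Ollama's `/api/chat` endpoint accepts (and that ChatOllama's `bind_tools` produces); the tool name and schema here are made up for illustration:

```python
import json

# Illustrative request payload in the shape Ollama's /api/chat expects.
# "get_current_weather" is a hypothetical tool, not part of the project.
payload = {
    "model": "llama3.1",
    "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

# If the traced request body has no "tools" key at all,
# the tools were never bound to the model.
print("tools" in payload)  # → True
print(json.dumps(payload, indent=2)[:40])
```

If the trace shows a request without this `tools` array, the problem is in how the model is wired up; if the array is present but ignored, it is the model itself.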

@AlazaziAmr
Author

AlazaziAmr commented Oct 28, 2024

Yes, it's ChatOllama, and I think the function tools are passed, according to this picture:
[LangSmith trace screenshot]
But maybe the model does not know how to use them!?

@fjsj
Member

fjsj commented Oct 28, 2024

Yes, it seems right. Looks like it's a model problem.

@AlazaziAmr
Author

Yes, I'll try to dig deeper into it, find the problem, and let you know about any updates.

Thank you so much for your time, appreciate it.
