ollama AssertionError for cli/python #48
cc @init27 @heyjustinai just FYI — if you see this more in the wild, please report. This seems odd; we had tested this path a bunch!
@ashwinb FYI it works for this endpoint, so it is possibly an Ollama issue.
@Mikkicon can you tell me
I am facing the same error. This is my machine info: I am running in a GitHub Codespace. Command for running the Docker container:
I followed the zero_to_hero_guide and am facing this issue for llama-stack-client (with ollama running via docker):

$ llama-stack-client --endpoint http://localhost:5001 inference chat-completion --message "hello, what model are you?"