'ramalama ps' returns exception on macOS when no container-based llms are running #488
Comments
Sounds like a good idea to me! I guess we just grep the pids in python and display that. Wanna take a go at implementing this @planetf1 ? Thanks for the feedback!
And we should try to make this technique portable so that it works across macOS and Linux (whilst adding no dependencies). If we must we can do: if macos:
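A portable "grep the pids" as mentioned above could lean on `ps -eo pid,comm`, which behaves the same on macOS and Linux, so no platform branches or extra dependencies are needed. This is only a sketch, not ramalama's actual code; the function name and the `pattern` default are hypothetical:

```python
import subprocess

def list_serving_pids(pattern="llama"):
    """Return (pid, command) pairs whose command name matches `pattern`.

    `ps -eo pid,comm` works identically on macOS and Linux, avoiding
    any need for an `if macos:` branch. The default pattern "llama" is
    a placeholder for whatever process ramalama serves models with.
    """
    out = subprocess.run(
        ["ps", "-eo", "pid,comm"],
        capture_output=True, text=True, check=True,
    ).stdout
    matches = []
    for line in out.splitlines()[1:]:  # skip the "PID COMM" header row
        pid, _, comm = line.strip().partition(" ")
        if pattern in comm:
            matches.append((int(pid), comm))
    return matches
```

A non-container `ramalama ps` could then render these pairs instead of raising when no container engine is usable.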
Do you have neither docker nor podman installed?
I'll recheck as it's been a while. I do have 'podman' installed.
Just rechecked with the latest ramalama code and got the same result. Both
If I explicitly set the container engine it works, i.e.:
This even works if set to docker (since there is an alias now set up, actually a filesystem link docker->podman).
It's only when the engine is not set that this fails, so the issue must be in the detection logic, i.e.
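One way the detection logic could degrade gracefully instead of raising is to probe for each engine on PATH and return `None` when nothing is found, letting `ramalama ps` report "no containers running". A minimal sketch under that assumption; this is not ramalama's actual detection code:

```python
import shutil

def detect_engine():
    """Pick the first available container engine, or None.

    Returning None instead of raising lets callers such as a
    `ps` subcommand fall back to reporting that no containers
    are running, rather than crashing on macOS setups without
    a working engine.
    """
    for engine in ("podman", "docker"):
        if shutil.which(engine):
            return engine
    return None
```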
So looking at detection, this line seems to be problematic since
I'm not familiar with krunkit and am using podman desktop 1.15. That said, podman was originally installed (and still is) via homebrew, so there may be some discrepancy here? (though all seems to work)
krunkit is pretty new; it's required for accelerated AI workloads within podman machine.
If krunkit doesn't exist we should probably just fall back to no containers. Although if you want to do all the cool containers stuff on macOS I highly recommend krunkit. A simple check for "/opt/podman/bin/krunkit" might be enough
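The simple check suggested above could look like the following sketch. The `/opt/podman/bin/krunkit` path comes from the comment; the extra PATH lookup and the helper names are assumptions, not ramalama's actual API:

```python
import os
import shutil

# Path mentioned in the discussion; homebrew installs may differ,
# so we also check PATH as a fallback.
KRUNKIT_PATH = "/opt/podman/bin/krunkit"

def has_krunkit():
    """True if the krunkit binary is present at the known
    installer path or anywhere on PATH."""
    return os.path.exists(KRUNKIT_PATH) or shutil.which("krunkit") is not None

def container_support():
    """Return "krunkit" when accelerated containers are possible,
    otherwise None so callers can degrade to non-container mode."""
    return "krunkit" if has_krunkit() else None
```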
Just reading about it. Optional, not default? containers/podman#22580 (reply in thread) So ideally ramalama would detect the applehv configuration and advise that krunkit is used?
Or we could simplify the check to effectively:
ramalama will work within applehv, it just won't be accelerated
I can confirm that with a krunkit VM ramalama works without further configuration. In my case the confusion arose from having installed podman via homebrew prior to podman desktop. Many macOS users rely on homebrew, so I imagine that's quite typical, and that distribution doesn't seem to include krunkit (not sure if that may change, or if it's version-specific, though mine was up to date). Secondly, krunkit isn't the default even when using podman desktop, though it is easy to select. I would suggest there are several options
It's not a sufficient check because if podman machine isn't started:
Open to PRs in general to make this better!
This should fix part of the issue:
On macOS Sonoma I have one model running. This appears not to be running as a container but is working well, with Apple Silicon metal/gpu support.
podman ps
shows a variety of other containers (of mine) running, but no llms from ramalama. The command should not hit an exception: either it should report that no containers are running, or, better, it should include any non-container llms being served. In my case that is
Here's the failing command: