ramalama should detect if it is missing dependencies like huggingface_hub
#688
Comments
This looks like ramalama found a `huggingface-cli` in your command path and then executed it, which threw the error about `huggingface-cli`. Ramalama would print the `missing_huggingface` message if `huggingface-cli` was not found.
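For context, a minimal sketch of the kind of check described above. The helper name and message text here are assumptions for illustration, not ramalama's actual code:

```python
import shutil

# Hypothetical message text; the real missing_huggingface string lives in
# ramalama/huggingface.py and is not reproduced here.
missing_huggingface = """
Optional: the huggingface-cli tool was not found on your PATH.
You can install it with:

    pip install huggingface_hub
"""


def check_huggingface_cli() -> bool:
    """Return True if huggingface-cli is on PATH; otherwise print the hint."""
    if shutil.which("huggingface-cli") is None:
        print(missing_huggingface)
        return False
    return True
```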
@rhatdan That is weird; no idea where that came from. However, the behavior still doesn't look correct. After removing the previous install and creating a fresh Python environment …
I installed `ramalama` using `pip install ramalama` on my Fedora 41 system. I tried to run my first model from HuggingFace with `ramalama run`, only to be met with an error about a missing Python module.

I assumed that `ramalama` would already have this dependency installed, as I was not able to see any instructions in the README saying that you need to install additional modules. I only found this reference to `pip install huggingface_hub` in the code:

`ramalama/ramalama/huggingface.py`, lines 7 to 11 at commit `5b1f1a1`
Since the goal of the project is to "make working with AI boring", can the CLI be improved to detect the missing dependency and prompt the user to install it?
(After manually installing the `huggingface_hub` module, the `run` command was successful.)
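To illustrate the kind of check being asked for, here is a minimal sketch, assuming a hypothetical `require_huggingface_hub()` helper that the HuggingFace backend could call before first using the module. This is not ramalama's actual implementation:

```python
import importlib.util
import sys


def require_huggingface_hub() -> None:
    """Exit with a friendly hint instead of a raw ModuleNotFoundError."""
    if importlib.util.find_spec("huggingface_hub") is None:
        print(
            "ramalama: pulling models from HuggingFace requires the "
            "'huggingface_hub' Python module.\n"
            "Install it with: pip install huggingface_hub",
            file=sys.stderr,
        )
        sys.exit(1)
```

Calling something like this at the start of the HuggingFace pull path would turn the traceback into an actionable message without adding a hard dependency.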