
ramalama should detect if it is missing dependencies like huggingface_hub #688

Closed
miabbott opened this issue Jan 31, 2025 · 2 comments · Fixed by #719

Comments

@miabbott

I installed ramalama using pip install ramalama on my Fedora 41 system.

$ pip install ramalama               
Defaulting to user installation because normal site-packages is not writeable
Collecting ramalama                                          
  Downloading ramalama-0.5.4-py3-none-any.whl.metadata (13 kB)               
Requirement already satisfied: argcomplete in /usr/lib/python3.13/site-packages (from ramalama) (3.5.3)                                                                                                                                                                                                                                                                                      
Downloading ramalama-0.5.4-py3-none-any.whl (63 kB)                                                                                                                                           
Installing collected packages: ramalama                                                                                                                                                       
Successfully installed ramalama-0.5.4          

I tried to run my first model from HuggingFace with ramalama run, only to be met with an error about a missing Python module:

$  ramalama run huggingface://ibm-granite/granite-3.1-8b-instruct
Traceback (most recent call last):
  File "/var/home/miabbott/.local/bin/huggingface-cli", line 5, in <module>
    from huggingface_hub.commands.huggingface_cli import main
ModuleNotFoundError: No module named 'huggingface_hub'
Error: Command '['huggingface-cli', 'download', '--local-dir', '/var/home/miabbott/.local/share/ramalama/repos/huggingface/ibm-granite/granite-3.1-8b-instruct', 'ibm-granite/granite-3.1-8b-instruct']' returned non-zero exit status 1.

I assumed that ramalama would already have this dependency installed, as I did not see any instructions in the README saying that additional modules need to be installed.

I only found this reference to pip install huggingface_hub in the code:

missing_huggingface = """
Optional: Huggingface models require the huggingface-cli module.
These modules can be installed via PyPi tools like pip, pip3, pipx, or via
distribution package managers like dnf or apt. Example:
pip install huggingface_hub
"""

Since the goal of the project is to "make working with AI boring", can the CLI be improved to detect the missing dependency and prompt the user to install it?

(After manually installing the huggingface_hub module, the run command was successful)
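A minimal sketch of the requested behavior, assuming a hypothetical helper (this is not ramalama's actual code): check both that the huggingface-cli binary is on $PATH and that the huggingface_hub module it imports is actually available, and return actionable messages instead of a raw traceback.

```python
import importlib.util
import shutil

# Install hint shown to the user; wording is illustrative.
INSTALL_HINT = (
    "huggingface-cli requires the huggingface_hub module.\n"
    "Install it with: pip install huggingface_hub"
)


def check_huggingface_deps() -> list:
    """Return a list of human-readable problems; empty means all is well."""
    problems = []
    if shutil.which("huggingface-cli") is None:
        problems.append("huggingface-cli not found in $PATH")
    if importlib.util.find_spec("huggingface_hub") is None:
        # The CLI stub can exist even when the module it imports does not,
        # which is exactly the failure shown in the traceback above.
        problems.append(INSTALL_HINT)
    return problems
```

Running such a check before shelling out to huggingface-cli would catch the reported case, where a stray stub was on $PATH but huggingface_hub itself was not installed.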

@rhatdan
Member

rhatdan commented Feb 1, 2025

This looks like ramalama found a huggingface-cli in your command path and executed it, which threw the error about the missing huggingface_hub module.

Ramalama would print the following if huggingface-cli was not found.

missing_huggingface = """
Optional: Huggingface models require the huggingface-cli module.
These modules can be installed via PyPi tools like pip, pip3, pipx, or via
distribution package managers like dnf or apt. Example:
pip install huggingface_hub
"""

@miabbott
Author

miabbott commented Feb 2, 2025

@rhatdan That is weird; no idea where huggingface-cli came from in my environment.

However, the behavior still doesn't look correct. After removing the previous install and creating a fresh Python venv, I got an error that huggingface-cli was not available, but nothing about how to install the module.

$ python -m venv ramalama_tmp
$ source ramalama_tmp/bin/activate

(ramalama_tmp) $ ramalama
bash: ramalama: command not found
(ramalama_tmp) $ huggingface-cli
bash: huggingface-cli: command not found

(ramalama_tmp) $ pip install ramalama
Collecting ramalama
  Downloading ramalama-0.5.5-py3-none-any.whl.metadata (13 kB)
Collecting argcomplete (from ramalama)
  Downloading argcomplete-3.5.3-py3-none-any.whl.metadata (16 kB)
Downloading ramalama-0.5.5-py3-none-any.whl (68 kB)
Downloading argcomplete-3.5.3-py3-none-any.whl (43 kB)
Installing collected packages: argcomplete, ramalama
Successfully installed argcomplete-3.5.3 ramalama-0.5.5

[notice] A new release of pip is available: 24.2 -> 25.0
[notice] To update, run: pip install --upgrade pip

(ramalama_tmp) $ ramalama
usage: ramalama [-h] [--container] [--debug] [--dryrun] [--engine ENGINE] [--gpu] [--image IMAGE] [--nocontainer] [--runtime {llama.cpp,vllm}] [--store STORE] [-v] {help,bench,benchmark,containers,ps,convert,info,list,ls,login,logout,perplexity,pull,push,rm,run,serve,stop,version} ...
ramalama: requires a subcommand

(ramalama_tmp) $ ramalama run huggingface://ibm-granite/granite-3.1-8b-instruct
URL pull failed and huggingface-cli not available
Error: Failed to pull model: 'failed to pull https://huggingface.co/ibm-granite/raw/main/granite-3.1-8b-instruct: HTTP Error 401: Unauthorized

(ramalama_tmp) $ huggingface-cli
bash: huggingface-cli: command not found
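This second failure mode could be covered by a small change on the pull-error path, sketched here with a hypothetical helper (names and wording are assumptions, not ramalama's code): when the URL pull fails and huggingface-cli is unavailable, append the install hint rather than surfacing only the raw HTTP error.

```python
import shutil

INSTALL_HINT = "pip install huggingface_hub"


def pull_error_message(url: str, http_error: str) -> str:
    """Build the error shown to the user when a HuggingFace pull fails."""
    msg = "failed to pull {}: {}".format(url, http_error)
    if shutil.which("huggingface-cli") is None:
        # Mirror the "huggingface-cli not available" notice, but add the
        # remedy the reporter asked for.
        msg += "\nhuggingface-cli not available; install it with: " + INSTALL_HINT
    return msg
```

With this, the fresh-venv run above would print the pip install suggestion alongside the 401 error instead of leaving the user to guess.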
