Idefics3Processor requires the PyTorch library but it was not found in your environment #137

simonw commented Nov 28, 2024

I got this error running mlx-vlm like this:

uv run \
  --with mlx-vlm \
  python -m mlx_vlm.generate \
    --model mlx-community/SmolVLM-Instruct-bf16 \
    --max-tokens 500 \
    --temp 0.5 \
    --prompt "Describe this image in detail" \
    --image IMG_4414.JPG

Output:

None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
Fetching 12 files: 100%|█████████████████████| 12/12 [00:00<00:00, 29782.04it/s]
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/Users/simon/.cache/uv/archive-v0/uKo-mbuWKgGQJAyh-vU5a/lib/python3.11/site-packages/mlx_vlm/generate.py", line 111, in <module>
    main()
  File "/Users/simon/.cache/uv/archive-v0/uKo-mbuWKgGQJAyh-vU5a/lib/python3.11/site-packages/mlx_vlm/generate.py", line 80, in main
    model, processor, image_processor, config = get_model_and_processors(
                                                ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/simon/.cache/uv/archive-v0/uKo-mbuWKgGQJAyh-vU5a/lib/python3.11/site-packages/mlx_vlm/generate.py", line 68, in get_model_and_processors
    model, processor = load(
                       ^^^^^
  File "/Users/simon/.cache/uv/archive-v0/uKo-mbuWKgGQJAyh-vU5a/lib/python3.11/site-packages/mlx_vlm/utils.py", line 292, in load
    processor = load_processor(model_path, processor_config=processor_config)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/simon/.cache/uv/archive-v0/uKo-mbuWKgGQJAyh-vU5a/lib/python3.11/site-packages/mlx_vlm/utils.py", line 335, in load_processor
    processor = AutoProcessor.from_pretrained(model_path, **processor_config)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/simon/.cache/uv/archive-v0/uKo-mbuWKgGQJAyh-vU5a/lib/python3.11/site-packages/transformers/models/auto/processing_auto.py", line 328, in from_pretrained
    return processor_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/simon/.cache/uv/archive-v0/uKo-mbuWKgGQJAyh-vU5a/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1651, in __getattribute__
    requires_backends(cls, cls._backends)
  File "/Users/simon/.cache/uv/archive-v0/uKo-mbuWKgGQJAyh-vU5a/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1639, in requires_backends
    raise ImportError("".join(failed))
ImportError: 
Idefics3Processor requires the PyTorch library but it was not found in your environment. Checkout the instructions on the
installation page: https://pytorch.org/get-started/locally/ and follow the ones that match your environment.
Please note that you may need to restart your runtime after installation.

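The two "None of PyTorch, TensorFlow >= 2.0, or Flax have been found" warnings at the top are the tell: transformers imports fine without a deep-learning backend, but Idefics3Processor is gated behind the PyTorch backend check (the requires_backends call in the traceback). An illustrative way to confirm that the bare --with mlx-vlm environment really has no torch is to attempt the import directly:

uv run \
  --with mlx-vlm \
  python -c "import torch"

This should fail with ModuleNotFoundError: No module named 'torch' when torch isn't pulled in as a dependency.
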
Adding --with torch to the above command fixes the error:

uv run \
  --with mlx-vlm \
  --with torch \
  python -m mlx_vlm.generate \
    --model mlx-community/SmolVLM-Instruct-bf16 \
    --max-tokens 500 \
    --temp 0.5 \
    --prompt "Describe this image in detail" \
    --image IMG_4414.JPG
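
As an optional sanity check before the slower model download and generation, the combined environment can be probed first; this is just an illustrative one-liner confirming torch resolves alongside mlx-vlm:

uv run \
  --with mlx-vlm \
  --with torch \
  python -c "import torch; print(torch.__version__)"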