
Is Nvidia/CUDA passthru supposed to work on non NixOS Linux hosts? #69

Open
mkoloberdin opened this issue Nov 22, 2023 · 4 comments


mkoloberdin commented Nov 22, 2023

Is this a thing? The README mentions that it's supposed to work on WSL, but what about other Linux hosts? I am using Arch Linux.

Any pointers on where to look to try to make it work, or at least to figure out whether it can work at all?

@MatthewCroughan
Member

It should. I mentioned in the video that things like invokeai work on a Steam Deck, but that was with AMD. Can you please try and report back?

@mkoloberdin
Author

Textgen errors out while attempting to load a model, unless I tick "cpu".
Here is the log:

2023-11-22 17:38:32 ERROR:Failed to load the model.
Traceback (most recent call last):
  File "/nix/store/1zzqmn5cl5c8dcbv37xp8xvvii892015-textgen-patchedSrc/modules/ui_model_menu.py", line 201, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/1zzqmn5cl5c8dcbv37xp8xvvii892015-textgen-patchedSrc/modules/models.py", line 79, in load_model
    output = load_func_map[loader](model_name)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/1zzqmn5cl5c8dcbv37xp8xvvii892015-textgen-patchedSrc/modules/models.py", line 141, in huggingface_loader
    model = model.cuda()
            ^^^^^^^^^^^^
  File "/nix/store/7hpffz24mjm12y5ymd2is43lxl7nf27b-python3-3.11.6-env/lib/python3.11/site-packages/transformers/modeling_utils.py", line 2243, in cuda
    return super().cuda(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/7hpffz24mjm12y5ymd2is43lxl7nf27b-python3-3.11.6-env/lib/python3.11/site-packages/torch/nn/modules/module.py", line 905, in cuda
    return self._apply(lambda t: t.cuda(device))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/7hpffz24mjm12y5ymd2is43lxl7nf27b-python3-3.11.6-env/lib/python3.11/site-packages/torch/nn/modules/module.py", line 797, in _apply
    module._apply(fn)
  File "/nix/store/7hpffz24mjm12y5ymd2is43lxl7nf27b-python3-3.11.6-env/lib/python3.11/site-packages/torch/nn/modules/module.py", line 797, in _apply
    module._apply(fn)
  File "/nix/store/7hpffz24mjm12y5ymd2is43lxl7nf27b-python3-3.11.6-env/lib/python3.11/site-packages/torch/nn/modules/module.py", line 797, in _apply
    module._apply(fn)
  File "/nix/store/7hpffz24mjm12y5ymd2is43lxl7nf27b-python3-3.11.6-env/lib/python3.11/site-packages/torch/nn/modules/module.py", line 820, in _apply
    param_applied = fn(param)
                    ^^^^^^^^^
  File "/nix/store/7hpffz24mjm12y5ymd2is43lxl7nf27b-python3-3.11.6-env/lib/python3.11/site-packages/torch/nn/modules/module.py", line 905, in <lambda>
    return self._apply(lambda t: t.cuda(device))
                                 ^^^^^^^^^^^^^^
  File "/nix/store/7hpffz24mjm12y5ymd2is43lxl7nf27b-python3-3.11.6-env/lib/python3.11/site-packages/torch/cuda/__init__.py", line 247, in _lazy_init
    torch._C._cuda_init()
RuntimeError: Found no NVIDIA driver on your system. Please check that you have an NVIDIA GPU and installed a driver from http://www.nvidia.com/Download/index.aspx

The driver is installed on the host; nvidia-smi works fine.
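
A minimal way to narrow this down is to check, from the same Python environment that textgen uses, whether PyTorch can see the host driver at all. This is only a sketch; how exactly you enter that environment is an assumption and depends on how the package is launched.

# cuda_check.py - illustrative diagnostic, run with the Python env textgen uses
import ctypes.util
import torch

# Does PyTorch's CUDA runtime initialise against the host driver?
print("torch.cuda.is_available():", torch.cuda.is_available())
# CUDA version the bundled PyTorch was built against
print("torch.version.cuda:", torch.version.cuda)
# Can the dynamic loader locate the host's libcuda at all?
print("libcuda via loader:", ctypes.util.find_library("cuda"))

If is_available() comes back False while nvidia-smi works, the problem is most likely in exposing the host's libcuda.so to the Nix-built PyTorch rather than in the driver itself.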

@MatthewCroughan
Member

@mkoloberdin could you join the nixified.ai matrix chat to debug this further?

@mdietrich16

Hey, having the same problem!
