Update README and model.py for Nvidia GPU support
README.md: Updated the Nvidia GPU (cuda) status to
:white_check_mark: indicating support. model.py: Added logic to
handle CUDA_VISIBLE_DEVICES, returning the corresponding
quay.io/ramalama/cuda:latest image.

Signed-off-by: Eric Curtin <[email protected]>
ericcurtin committed Jan 3, 2025
1 parent a72b3ab commit 337942c
Showing 2 changed files with 4 additions and 3 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -87,7 +87,7 @@ curl -fsSL https://raw.githubusercontent.com/containers/ramalama/s/install.sh |
| Apple Silicon GPU (Linux / Asahi) | :white_check_mark: |
| Apple Silicon GPU (macOS) | :white_check_mark: |
| Apple Silicon GPU (podman-machine) | :white_check_mark: |
-| Nvidia GPU (cuda) | :x: [Containerfile](https://github.com/containers/ramalama/blob/main/container-images/cuda/Containerfile) available but not published to quay.io |
+| Nvidia GPU (cuda) | :white_check_mark: |
| AMD GPU (rocm) | :white_check_mark: |

## COMMANDS
5 changes: 3 additions & 2 deletions ramalama/model.py
@@ -101,8 +101,9 @@ def _image(self, args):
 gpu_type, _ = get_gpu()
 if gpu_type == "HIP_VISIBLE_DEVICES":
     return "quay.io/ramalama/rocm:latest"
-
-if gpu_type == "ASAHI_VISIBLE_DEVICES":
+elif gpu_type == "CUDA_VISIBLE_DEVICES":
+    return "quay.io/ramalama/cuda:latest"
+elif gpu_type == "ASAHI_VISIBLE_DEVICES":
+    return "quay.io/ramalama/asahi:latest"

 return args.image
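The branch added in this commit maps a detected GPU environment variable to a container image, falling back to the user-supplied default. A minimal standalone sketch of that selection logic (the helper name `select_image` is hypothetical; the real code is the `Model._image` method in `ramalama/model.py`):

```python
# Hypothetical standalone version of the image-selection logic in
# ramalama's Model._image: pick a container image by GPU type,
# falling back to the caller's default (args.image in the real code).

def select_image(gpu_type: str, default_image: str) -> str:
    images = {
        "HIP_VISIBLE_DEVICES": "quay.io/ramalama/rocm:latest",    # AMD ROCm
        "CUDA_VISIBLE_DEVICES": "quay.io/ramalama/cuda:latest",   # Nvidia (added here)
        "ASAHI_VISIBLE_DEVICES": "quay.io/ramalama/asahi:latest", # Apple Silicon / Asahi
    }
    return images.get(gpu_type, default_image)


if __name__ == "__main__":
    # With an Nvidia GPU detected, the new CUDA image is returned.
    print(select_image("CUDA_VISIBLE_DEVICES", "quay.io/ramalama/ramalama:latest"))
```

Expressing the chain as a dict lookup keeps the fallback in one place; the actual commit uses an `if`/`elif` chain, which is equivalent for this small set of GPU types.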
