requirement error: invalid expression
Starting the LocalAI container with GPU access fails:

# docker run -ti --rm --name local-ai \
    --gpus all \
    -v $XDG_DATA_HOME/localai/models:/build/models:cached \
    -p 8080:8080 localai/localai:latest
docker: Error response from daemon: failed to create task for container: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: error during container init: error running prestart hook #0: exit status 1, stdout: , stderr: Auto-detected mode as 'legacy' nvidia-container-cli: requirement error: invalid expression: unknown.
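The "invalid expression" part of this message is produced by nvidia-container-cli while it parses the CUDA requirement string (NVIDIA_REQUIRE_CUDA) that an image declares. As a first check, assuming that requirement string is what the CLI is rejecting here, the value baked into the image can be compared with the toolkit version installed on the host:

# docker inspect localai/localai:latest --format '{{range .Config.Env}}{{println .}}{{end}}' | grep NVIDIA   # assumption: the image sets NVIDIA_REQUIRE_CUDA
# nvidia-container-cli --version   # NVIDIA Container Toolkit CLI version on the host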
The same --gpus flag works with the plain CUDA base image:

# docker run --gpus all -it --rm nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi
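If the host toolkit turns out to be older than the CUDA release the LocalAI image requires, a commonly suggested workaround, offered here only as an assumption and not a verified fix for this report, is to upgrade the NVIDIA Container Toolkit and restart the Docker daemon:

# apt-get update && apt-get install --only-upgrade nvidia-container-toolkit   # assumes a Debian/Ubuntu host
# systemctl restart docker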