This repository has been archived by the owner on Nov 28, 2022. It is now read-only.
I am just trying to get the TensorRT server started, but on two different machines with fresh downloads of the GRE, it gets stuck at the "Initializing TensorRT classifiers" step. I have been able to get the GRE Caffe server code up and running, and I have tried clearing the Docker cache without success.
I'm running this on a DGX-1 (16 GB Volta version). I'm wondering whether TensorRT 2 may not work correctly with this GPU.
Yes, you likely need to use the latest version of TensorRT instead; you will also get better performance. You will probably need to make small edits to the Dockerfile for that.
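For what it's worth, the kind of edit meant here is usually just swapping the base image in the server's Dockerfile for one that ships a newer TensorRT. A hypothetical sketch follows; the image tag below is an assumption, not the repository's actual contents, so pick a release compatible with your driver and CUDA version:

```Dockerfile
# Hypothetical edit: build the inference server on top of an NGC TensorRT
# container instead of the original TensorRT 2 base image.
# The tag is an assumption; choose one that matches your CUDA driver.
FROM nvcr.io/nvidia/tensorrt:18.08-py3

# ...keep the remaining build steps from the original Dockerfile...
```

After the edit, rebuild the image (e.g. `docker build -t inference_server .` from the directory containing the Dockerfile) and rerun the `nvidia-docker run` command shown below.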
I have already worked with the most recent release of TensorRT and the Inference Server and found that the Linux implementation works, but the inference client does not work on Windows in its current form. My workflow is an application that interfaces using JSON data.
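If the stock client won't build on Windows, a plain HTTP request works on any platform. A minimal sketch using only the Python standard library; the endpoint path (`/api/classify`) and the JSON payload shape are assumptions here, so adjust them to whatever your server actually expects:

```python
import json
import urllib.request


def build_request(url, payload):
    """Wrap a dict as a JSON POST request (Content-Type: application/json)."""
    data = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )


if __name__ == "__main__":
    # Hypothetical endpoint and payload; replace with your server's API.
    req = build_request(
        "http://127.0.0.1:8000/api/classify",
        {"image": "<base64-encoded image bytes>"},
    )
    # Uncomment once the server is running:
    # with urllib.request.urlopen(req) as resp:
    #     print(resp.read().decode("utf-8"))
```

Because it only depends on the standard library, the same script runs unchanged on Windows and Linux.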
gpu-rest-engine-master$ nvidia-docker run --name=server --net=host --rm inference_server
2018/09/18 02:31:30 Initializing TensorRT classifiers