I'm trying to measure the inference speed on a single image with SqueezeNet. When I run it on the CPU, SqueezeNet seems fast enough (compared to VGG). But on the GPU, SqueezeNet becomes much slower, even slower than on the CPU.
Does anyone know why it slows down on the GPU? Is there something I should do to SqueezeNet when running it on the GPU?
Here are the results of some experiments I ran comparing SqueezeNet and VGG16 speeds on both CPU and GPU.
On CPU, SqueezeNet is much faster than VGG16.
[inference time]
VGG average response time: 2.21110591888[sec/image]
SqueezeNet average response time: 0.288291954994[sec/image]
On GPU, VGG16 gets much faster, even faster than SqueezeNet, while SqueezeNet becomes even slower than it is on CPU.
[inference time]
VGG16 average response time: 0.0961683591207[sec/image] # gets very fast
SqueezeNet average response time: 1.50337402026[sec/image] # gets very slow <= why?
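To show what I mean by "average response time", here is a minimal sketch of the kind of per-image timing loop I have in mind. This is only an illustration: I'm assuming a PyTorch-style API and torchvision model definitions here, and the warm-up count, run count, and input size are arbitrary; my actual script may differ.

```python
import time
import torch
import torchvision.models as models

def average_inference_time(model, device, n_runs=50):
    # Illustrative benchmark only -- the original framework/script is not
    # shown above, so this assumes a PyTorch-style API.
    model = model.to(device).eval()
    x = torch.randn(1, 3, 224, 224, device=device)  # single dummy image

    with torch.no_grad():
        # Warm-up runs: the first GPU calls can include context creation and
        # memory allocation, which would otherwise inflate the measured time.
        for _ in range(5):
            model(x)
        if device.type == "cuda":
            torch.cuda.synchronize()  # make sure warm-up work has finished

        start = time.time()
        for _ in range(n_runs):
            model(x)
        if device.type == "cuda":
            torch.cuda.synchronize()  # wait for all queued GPU kernels
        return (time.time() - start) / n_runs

for device in (torch.device("cpu"), torch.device("cuda")):
    for name, model in (("VGG16", models.vgg16()),
                        ("SqueezeNet", models.squeezenet1_1())):
        print(f"{name} on {device}: "
              f"{average_inference_time(model, device):.4f} [sec/image]")
```

In a loop like this, skipping the warm-up runs or the final synchronize would distort the GPU numbers, which is one thing I'd like to rule out as the cause.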
Thanks!