
Do I just need to implement the softmax layer? #14

Open
ljq19950723 opened this issue Mar 22, 2018 · 9 comments

Comments

@ljq19950723

First, thank you for your code. I compared your pluginImplement.cpp with the samplePlugin.cpp from TensorRT's samplePlugin sample. Do I only need to implement the softmax layer for MobileNet to work on TensorRT?

@chenzhi1992
Owner

You need to implement the depthwise layer, too.
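For reference, the core operation a depthwise plugin has to reproduce is a per-channel convolution: each input channel is convolved with its own k×k filter, and channels are never mixed. The following is only a minimal CPU sketch of that math (NCHW layout, batch 1, stride 1, no padding, function name is hypothetical), not the actual plugin code from this repo, whose `enqueue()` would run an equivalent CUDA kernel.

```cpp
#include <vector>
#include <cstddef>

// Naive depthwise convolution sketch: NCHW, batch 1, stride 1, no padding.
// Each channel c is convolved only with its own k x k filter, which is
// what distinguishes a depthwise layer from a standard convolution.
std::vector<float> depthwiseConv(const std::vector<float>& in,
                                 const std::vector<float>& weights,
                                 int channels, int h, int w, int k) {
    const int oh = h - k + 1, ow = w - k + 1;
    std::vector<float> out(static_cast<size_t>(channels) * oh * ow, 0.f);
    for (int c = 0; c < channels; ++c)
        for (int y = 0; y < oh; ++y)
            for (int x = 0; x < ow; ++x) {
                float acc = 0.f;
                for (int ky = 0; ky < k; ++ky)
                    for (int kx = 0; kx < k; ++kx)
                        acc += in[(c * h + y + ky) * w + (x + kx)] *
                               weights[(c * k + ky) * k + kx];
                out[(c * oh + y) * ow + x] = acc;
            }
    return out;
}
```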

@ljq19950723
Author

Do I only need to implement the SoftmaxPlugin functions inherited from the parent class, or are there other functions as well?

@chenzhi1992
Owner

Yes, you need to implement the SoftmaxPlugin.
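For clarity, the computation a SoftmaxPlugin has to perform is a softmax across the channel dimension at each spatial position of the NCHW blob. Below is a minimal, numerically stable CPU sketch of that math (batch 1; the function name is hypothetical); a real plugin would implement the same formula inside `enqueue()` as a CUDA kernel.

```cpp
#include <vector>
#include <cmath>
#include <algorithm>

// Across-channel softmax for an NCHW blob (batch 1): at each spatial
// position (y, x) the C channel values are exponentiated and normalized
// so they sum to 1. Subtracting the per-position max before exp() keeps
// the computation numerically stable for large inputs.
std::vector<float> channelSoftmax(const std::vector<float>& in,
                                  int channels, int h, int w) {
    std::vector<float> out(in.size());
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float maxv = in[(0 * h + y) * w + x];
            for (int c = 1; c < channels; ++c)
                maxv = std::max(maxv, in[(c * h + y) * w + x]);
            float sum = 0.f;
            for (int c = 0; c < channels; ++c) {
                const float e = std::exp(in[(c * h + y) * w + x] - maxv);
                out[(c * h + y) * w + x] = e;
                sum += e;
            }
            for (int c = 0; c < channels; ++c)
                out[(c * h + y) * w + x] /= sum;
        }
    return out;
}
```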

@xchani

xchani commented Apr 5, 2018

I found that the channel dimension of the output blob from the plugin reshape layer is actually the number of classes, so an off-the-shelf across-channel softmax layer should satisfy the requirement. @chenzhi1992

@chenzhi1992
Owner

All right, you can verify whether the result is correct.

@twmht

twmht commented Apr 11, 2018

@chenzhi1992

Do you have plans to release the code for the softmax and depthwise layers?

@linux-devil

@xchani Did the off-the-shelf across-channel softmax work for you?

@xchani

xchani commented Apr 20, 2018

@linux-devil No. Although the off-the-shelf softmax and my own softmax implementation both operate across channels, the off-the-shelf softmax still produces wrong results. I cannot figure out why this happens.

@Optimus1072

@xchani Did it work with your own softmax implementation? Could you please share it and explain why the dimension of the last layer is 12764?
Thanks :-)
