This repository was archived by the owner on Nov 17, 2023. It is now read-only.
Replies: 2 comments
-
@apache/mxnet-committers: This issue has been inactive for the past 90 days. It has no label and needs triage. For general "how-to" questions, our user forum (and Chinese version) is a good place to get help.
-
@efan3000 Would you kindly paste a code sample that reproduces the issue, so that we can best answer your question?
-
Hi,
I added the following two layers to a classic network:
mask = mx.symbol.broadcast_div(relu, relu, name='div')
relu2 = mx.symbol.broadcast_mul(relu, mask, name='relu2')
and found that the loss refuses to decrease. To my knowledge, the network should behave the same with or without these layers. The same construction works in Caffe, so I have no idea why it fails in MXNet.
Thanks.
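One possible explanation (an assumption, not confirmed in this thread): a ReLU output contains zeros wherever the pre-activation was negative, and IEEE floating-point arithmetic evaluates 0/0 as NaN, so `relu / relu` is not an identity mask. The sketch below demonstrates this with NumPy in place of MXNet symbols:

```python
import numpy as np

# Simulated ReLU output: zeros appear wherever the pre-activation was negative.
relu = np.array([0.0, 1.5, 0.0, 2.0])

# mask = relu / relu is 1 where relu > 0, but 0/0 evaluates to NaN.
with np.errstate(invalid="ignore"):  # suppress the 0/0 warning
    mask = relu / relu

# relu2 = relu * mask: the NaNs propagate (0 * NaN is still NaN),
# so relu2 differs from relu exactly where the ReLU was zero.
relu2 = relu * mask

assert np.isnan(mask[0]) and mask[1] == 1.0
assert np.isnan(relu2[0]) and relu2[1] == 1.5
```

If this is the cause, NaN gradients would poison the backward pass and the loss would stall, which matches the reported symptom; Caffe may appear to work if its division layer handles the zero case differently.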