Extract gradients of the hidden layers after inference #14724
Unanswered · adelshafiei asked this question in Q&A · Replies: 2 comments
-
Hey, this is the MXNet Label Bot.
-
@mxnet-label-bot add [question, python]
-
Description
autograd.record() does not work when a model is loaded from a checkpoint.
Environment info (Required)
MXNet 1.2.1-cu90
Package used (Python/R/Scala/Julia): Python
Error Message:
Traceback (most recent call last):
File "main.py", line 167, in
out.backward(retain_graph=True, train_mode=False)
File "/home/adel/.local/lib/python3.6/site-packages/mxnet/ndarray/ndarray.py", line 2200, in backward
ctypes.c_void_p(0)))
File "/home/adel/.local/lib/python3.6/site-packages/mxnet/base.py", line 252, in check_call
raise MXNetError(py_str(_LIB.MXGetLastError()))
mxnet.base.MXNetError: [02:50:45] src/imperative/imperative.cc:293: Check failed: !AGInfo::IsNone(*i) Cannot differentiate node because it is not in a computational graph. You need to set is_recording to true or use autograd.record() to save computational graphs for backward. If you want to differentiate the same graph twice, you need to pass retain_graph=True to backward.
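For reference, the minimal pattern this error message asks for looks roughly like the sketch below. It is not the poster's code; the input and the computation are made up, and it only shows how attach_grad(), autograd.record(), and backward() fit together:

import mxnet as mx
from mxnet import autograd

x = mx.nd.random.uniform(shape=(1, 3, 224, 224))
x.attach_grad()              # allocate a gradient buffer for x
with autograd.record():      # record imperative operations into a graph
    y = (x * 2).sum()
y.backward()                 # fills x.grad with dy/dx
print(x.grad)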
Minimum reproducible example
import mxnet as mx
from mxnet import autograd

ctx = mx.gpu(0)                                   # context used for inference (assumed)
image = mx.nd.zeros((1, 3, 224, 224), ctx=ctx)    # preprocessed input image (placeholder)

with autograd.record(train_mode=False):
    sym, arg_params, aux_params = mx.model.load_checkpoint('resnet-18', 0)
    mod = mx.mod.Module(symbol=sym, label_names=None, context=ctx)
    mod.bind(for_training=False,
             data_shapes=[('data', (1, 3, 224, 224))],
             label_shapes=mod._label_shapes, grad_req='write')
    mod.set_params(arg_params, aux_params, allow_missing=True)
    out = mod.predict(image)
    out.backward(retain_graph=True, train_mode=False)  # raises the MXNetError shown above
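A likely reason for the error: Module.predict() runs the forward pass through the bound symbolic executor, while autograd.record() only records imperative NDArray operations, so no computational graph is built even though the call sits inside the with block. One common workaround is to wrap the loaded symbol in a Gluon SymbolBlock so the forward pass runs imperatively under autograd. The sketch below assumes MXNet 1.3+ (where SymbolBlock.imports is available) and that resnet-18-symbol.json / resnet-18-0000.params are on disk; on 1.2.1 the block would have to be built from the load_checkpoint output and its parameters set by hand:

import mxnet as mx
from mxnet import autograd, gluon

ctx = mx.gpu(0)  # or mx.cpu()

# Load the checkpoint as an imperative Gluon block (MXNet 1.3+).
net = gluon.SymbolBlock.imports('resnet-18-symbol.json', ['data'],
                                'resnet-18-0000.params', ctx=ctx)

image = mx.nd.random.uniform(shape=(1, 3, 224, 224), ctx=ctx)  # placeholder input
image.attach_grad()

with autograd.record(train_mode=False):   # record the forward pass in inference mode
    out = net(image)

out.backward(train_mode=False)            # succeeds: the graph was recorded
print(image.grad)                         # gradient of the output w.r.t. the input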
What have you tried to solve it?
I am trying to obtain the gradients of the hidden layers after loading a checkpoint and running one inference pass. I wrap the code in autograd.record(), but calling backward() raises the error above. If I do not call backward(), no gradients are computed and they remain all zeros.
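If backward() succeeds (for instance with the SymbolBlock sketch above), per-layer weight gradients can then be read from the block's parameters. A brief sketch, assuming net is the SymbolBlock from that example and backward() has already been called:

for name, param in net.collect_params().items():
    if param.grad_req != 'null':          # parameters that keep gradients
        print(name, param.grad().abs().mean().asscalar())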