Hello @rishikksh20, thanks for sharing your work. I have run into some issues while training the model:
```
hs = self.length_regulator(hs, ds, ilens)  # (B, Lmax, adim)
  File "/data/tuong/Yen/AdaSpeech/env/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1051, in _call_impl
    return forward_call(*input, **kwargs)
  File "/data/tuong/Yen/AdaSpeech/core/duration_modeling/length_regulator.py", line 63, in forward
    xs = [self._repeat_one_sequence(x, d) for x, d in zip(xs, ds)]
  File "/data/tuong/Yen/AdaSpeech/core/duration_modeling/length_regulator.py", line 63, in <listcomp>
    xs = [self._repeat_one_sequence(x, d) for x, d in zip(xs, ds)]
  File "/data/tuong/Yen/AdaSpeech/core/duration_modeling/length_regulator.py", line 93, in _repeat_one_sequence
    out.append(x_.repeat(int(d_), 1))
RuntimeError: Trying to create tensor with negative dimension -6: [-6, 256]
```
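From the traceback, it looks like a negative predicted duration (-6) reaches `x_.repeat(int(d_), 1)`, which then asks for a tensor with a negative dimension. Below is a minimal sketch of the failure and of a clamp that avoids the crash (the toy `ds` values are made up, not from my data):

```python
import torch

# Minimal sketch: ds holds per-phoneme durations; a negative value
# (like the -6 in the traceback) makes torch.Tensor.repeat request
# a tensor with a negative dimension.
x = torch.randn(3, 256)            # (phonemes, adim), adim=256 as above
ds = torch.tensor([2, -6, 3])      # one negative predicted duration
ds = torch.clamp(ds, min=0)        # guard: negative durations -> 0 repeats
out = [x_.repeat(int(d_), 1) for x_, d_ in zip(x, ds)]
print(torch.cat(out, dim=0).shape) # torch.Size([5, 256])
```

Of course, clamping only suppresses the crash; the negative durations presumably come from the same instability that makes the grad norm nan.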
In addition, the gradient is not updated; the trainer logs: `WARNING - grad norm is nan. Do not update model.`
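For reference, that warning looks like the usual non-finite-gradient guard in an ESPnet-style training loop; a minimal runnable sketch of such a check (the model, optimizer, and clip value 1.0 here are placeholders, not taken from this repo):

```python
import math
import torch

# Placeholder model/optimizer to make the sketch runnable.
model = torch.nn.Linear(4, 4)
optimizer = torch.optim.Adam(model.parameters())

loss = model(torch.randn(2, 4)).sum()
loss.backward()

# Clip gradients and skip the update if the total norm is nan.
grad_norm = float(torch.nn.utils.clip_grad_norm_(model.parameters(), 1.0))
if math.isnan(grad_norm):
    print("WARNING - grad norm is nan. Do not update model.")
else:
    optimizer.step()
```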
Can you help me solve this issue? Thank you.