In train.py there is code that scales the learning rate:
utils.opt.adjust_learning_rate(optimizer, epoch)
However, the comment says the learning rate is decayed by 1/10 every 30 epochs, while scale is set to 2 by default. Is this what was used in the research paper? Is the decay interval 2 epochs or 30 epochs?
def adjust_learning_rate(optimizer, epoch, scale=2):
    # Sets the learning rate to the initial LR decayed by 10 every 30 epochs
    for param_group in optimizer.param_groups:
        lr = param_group['lr']
        lr = lr * (0.1 ** (epoch // scale))
        param_group['lr'] = lr
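Note also that, as written, the function rescales param_group['lr'] in place, so if it is called once per epoch the decay compounds across calls rather than following a clean step schedule. For comparison, here is a minimal sketch of the "1/10 every 30 epochs" schedule the comment describes, recomputed from the initial LR on every call; the initial_lr parameter and its default are my assumptions for illustration, not values from the repo or the paper:

def adjust_learning_rate(optimizer, epoch, initial_lr=0.1, scale=30):
    # Decay the LR by a factor of 10 every `scale` epochs, recomputed
    # from initial_lr so repeated per-epoch calls do not compound.
    # initial_lr=0.1 and scale=30 are assumed defaults for illustration.
    lr = initial_lr * (0.1 ** (epoch // scale))
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

With this version, calling it at the start of every epoch gives the stepwise schedule the comment promises, whatever the intended scale turns out to be.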