
Scaling of learning rate #9

Open
nexus1203 opened this issue Jun 12, 2024 · 0 comments
Comments

@nexus1203
In train.py there is code that scales the learning rate:

utils.opt.adjust_learning_rate(optimizer, epoch)

However, in the comment you wrote that it is scaled by 1/10 every 30 epochs, but `scale` is set to 2 by default. Is this the setting used in the research paper? Is the scale 2 epochs or 30 epochs?

def adjust_learning_rate(optimizer, epoch, scale=2):
    # Sets the learning rate to the initial LR decayed by 10 every 30 epochs
    for param_group in optimizer.param_groups:
        lr =  param_group['lr']
        lr = lr * (0.1 ** (epoch // scale))
        param_group['lr'] = lr
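
For comparison, the behaviour described in the comment (decay by 10 every 30 epochs) would correspond to something like the following sketch using PyTorch's built-in StepLR scheduler. This is only my assumption of the intended schedule, not code from this repository, and the model/optimizer below are placeholders:

import torch

# Sketch (assumption): decay the learning rate by 0.1 every 30 epochs,
# expressed with torch.optim.lr_scheduler.StepLR instead of a manual helper.
model = torch.nn.Linear(10, 2)  # placeholder model for illustration
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(90):
    # ... one epoch of training, calling optimizer.step() per batch ...
    optimizer.step()
    scheduler.step()  # lr: 0.1 for epochs 0-29, 0.01 for 30-59, 0.001 for 60-89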