When did the optimizer switch to SGD? #25

Open
yunbujian opened this issue Dec 14, 2021 · 1 comment

Comments

@yunbujian

I set the initial lr=0.0001 and final_lr=0.1,
but I still don't know when the optimizer becomes SGD.
Do I need to raise the learning rate to the final learning rate manually?
Thanks!

@jgvinholi

There is no hard switch; instead, there is a smooth transition between the behavior of Adam and SGD.
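
For illustration, here is a minimal sketch of how an AdaBound-style bound schedule behaves. This assumes the optimizer follows the AdaBound scheme (which the `final_lr` parameter suggests); the bound formulas and the `gamma` default below come from the AdaBound paper and are illustrative, not taken from this repo:

```python
# Minimal sketch of AdaBound-style dynamic learning-rate bounds.
# Assumption: the optimizer follows the AdaBound scheme; the formulas
# and the gamma default are from the AdaBound paper, not this repository.
final_lr = 0.1   # value from the question above
gamma = 1e-3     # controls how quickly the bounds converge to final_lr

def bounds(step):
    # The per-parameter step size computed by Adam is clipped into
    # [lower, upper]. Both bounds converge to final_lr as step grows,
    # so the update gradually approaches plain SGD with lr = final_lr.
    lower = final_lr * (1.0 - 1.0 / (gamma * step + 1.0))
    upper = final_lr * (1.0 + 1.0 / (gamma * step))
    return lower, upper

for step in (1, 100, 1_000, 10_000, 100_000):
    lo, hi = bounds(step)
    print(f"step {step:>7}: Adam step size clipped to [{lo:.4f}, {hi:.4f}]")
```

Early in training the interval is very wide (essentially Adam); as training progresses it tightens around final_lr (essentially SGD). If that is the scheme in use here, no manual change of the learning rate should be needed.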
