per layer learning rate schedule #19352
Unanswered · jakubMitura14 asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Hello, I am fine-tuning a pretrained model and want the learning rate to decrease the deeper I go in the network. I found a callback that does this [1], but unfortunately it gives me some strange CUDA initialization errors. I also managed a more manual solution, sketched below, and it works. However, I cannot use a learning rate scheduler with it, because the learning rates are fixed per layer. Can I call the optimizer setup again after each training epoch to adjust the base learning rate?
Summarizing: I would like the learning rate scheduler to give me a different base learning rate each epoch, and then set the per-layer learning rates from that value, as in the sketch below.
Is it possible?
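A minimal sketch of the kind of per-layer setup I mean (names such as `FineTuneModule`, `backbone`, `base_lr`, and `layer_decay` are placeholders, not my exact code):

```python
import torch
import pytorch_lightning as pl


class FineTuneModule(pl.LightningModule):
    def __init__(self, backbone, base_lr=1e-3, layer_decay=0.5):
        super().__init__()
        self.backbone = backbone          # pretrained model to fine-tune
        self.base_lr = base_lr            # learning rate of the first layer
        self.layer_decay = layer_decay    # multiplicative decay per layer depth

    def configure_optimizers(self):
        # One parameter group per top-level child module; deeper layers get a
        # smaller multiple of the base learning rate.
        layers = list(self.backbone.children())
        param_groups = [
            {"params": layer.parameters(),
             "lr": self.base_lr * self.layer_decay ** i}
            for i, layer in enumerate(layers)
        ]
        optimizer = torch.optim.AdamW(param_groups)

        # ExponentialLR multiplies every group's learning rate by the same
        # factor each epoch, so the layer-wise ratios set above are preserved
        # while the overall level decays, without rebuilding the optimizer.
        scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)
        return {
            "optimizer": optimizer,
            "lr_scheduler": {"scheduler": scheduler, "interval": "epoch"},
        }
```

As far as I understand, schedulers like `ExponentialLR` scale each parameter group independently, so the per-layer ratios would be kept at every epoch, and `LambdaLR` even accepts one lambda per parameter group if the schedules need to differ per layer.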
[1] https://pypi.org/project/finetuning-scheduler/