How to redefine optimizer for pl.LightningModule? #9354
-
The great advantage of PyTorch Lightning over vanilla PyTorch is its more convenient interface for optimization loops: instead of reinventing the wheel every time with a manual training loop, the setup of the optimizer is done in the `configure_optimizers` method. However, it seems that this choice fixes the optimizer properties in the definition of the class, and often you would like to try several optimizers and change their parameters. One way to do it is to pass additional arguments to the constructor of the `LightningModule`. One may also try to access the optimizers directly from outside the class, but that is probably an unsafe approach. Is there some alternative way to reconfigure the optimizer?
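For context, the usual pattern looks roughly like the minimal sketch below (class name and hyperparameters are illustrative, not taken from the original post): the optimizer is built inside `configure_optimizers`, and constructor arguments are the typical way to parametrize it.

```python
import torch
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self, optimizer_name: str = "adam", lr: float = 1e-3):
        super().__init__()
        self.save_hyperparameters()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        # The optimizer choice is fixed inside the class definition;
        # constructor arguments are the usual way to parametrize it.
        if self.hparams.optimizer_name == "adam":
            return torch.optim.Adam(self.parameters(), lr=self.hparams.lr)
        return torch.optim.SGD(self.parameters(), lr=self.hparams.lr)
```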
-
Hi @Godofnothing, I am afraid there is no clean way to do so. What I personally do to hack around this (if I need it) is to have arguments on the model that define what is to be returned by `configure_optimizers`, and then call `self.trainer.accelerator.setup_optimizers(self.trainer)` in any of the other hooks. That makes sure `configure_optimizers` is called again and converts everything to the correct types. Inside your `configure_optimizers` function you would then have to handle the creation of the optimizers for the different stages of training.
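A rough sketch of that workaround, assuming the switch happens in `on_train_epoch_start` at an arbitrary epoch (the hook choice, flag name, and epoch threshold are illustrative; only the `setup_optimizers` call is taken from the reply above):

```python
import torch
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self, lr: float = 1e-3):
        super().__init__()
        self.lr = lr
        self.use_sgd = False  # flag read by configure_optimizers
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        # The returned optimizer depends on the current value of the flag.
        if self.use_sgd:
            return torch.optim.SGD(self.parameters(), lr=self.lr)
        return torch.optim.Adam(self.parameters(), lr=self.lr)

    def on_train_epoch_start(self):
        # Flip the flag mid-training and ask the accelerator to run
        # configure_optimizers again so the switch takes effect.
        if self.current_epoch == 10 and not self.use_sgd:
            self.use_sgd = True
            self.trainer.accelerator.setup_optimizers(self.trainer)
```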
-
I've come up with this kind of solution (dangerous and breaking the law, but superficially working). Add these two methods to the `LightningModule`:
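The original snippet is not preserved here, so the following is only a hedged reconstruction of the idea: one hypothetical method (`set_optimizer`) stores a new optimizer factory on the module, and a second (`reset_optimizers`) forces the trainer to rebuild its optimizers. The method names and class layout are illustrative, not the author's original code.

```python
import torch
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self, lr: float = 1e-3):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)
        # Default optimizer factory; can be replaced from outside the class.
        self._optimizer_factory = lambda params: torch.optim.Adam(params, lr=lr)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return self._optimizer_factory(self.parameters())

    # --- the two added methods (hypothetical names) ---
    def set_optimizer(self, optimizer_cls, **kwargs):
        # Store a new optimizer factory; nothing is rebuilt yet.
        self._optimizer_factory = lambda params: optimizer_cls(params, **kwargs)

    def reset_optimizers(self):
        # Force the trainer to call configure_optimizers again
        # (only valid while the module is attached to a trainer).
        self.trainer.accelerator.setup_optimizers(self.trainer)
```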
And an example of the use case:
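The original usage snippet is likewise missing; under the same assumptions as above, it could look something like this, e.g. switching the optimizer from a callback mid-training:

```python
from torch.utils.data import DataLoader, TensorDataset


class SwitchOptimizer(pl.Callback):
    # Hypothetical callback: switch to SGD when epoch 5 starts.
    def on_train_epoch_start(self, trainer, pl_module):
        if trainer.current_epoch == 5:
            pl_module.set_optimizer(torch.optim.SGD, lr=1e-2, momentum=0.9)
            pl_module.reset_optimizers()


# Dummy data just to make the example runnable.
dataset = TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,)))
train_loader = DataLoader(dataset, batch_size=8)

model = LitModel(lr=1e-3)
trainer = pl.Trainer(max_epochs=10, callbacks=[SwitchOptimizer()])
trainer.fit(model, train_loader)
```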