Hi @Godofnothing,

I'm afraid there is no clean way to do this. My personal workaround (when I need it) is to give the model arguments that define what `configure_optimizers` should return, and then call `self.trainer.accelerator.setup_optimizers(self.trainer)` from any of the other hooks. That forces `configure_optimizers` to be called again and converts everything to the correct types. Inside your `configure_optimizers` you then handle creating the optimizers for the different stages of training.
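For illustration, here is a minimal sketch of that pattern, assuming a Lightning version where `trainer.accelerator.setup_optimizers` is available (as described above). The `stage` argument, the `switch_epoch` threshold, and the particular optimizers are hypothetical choices for the example, not part of any Lightning API:

```python
import torch
import pytorch_lightning as pl


class TwoStageModel(pl.LightningModule):
    def __init__(self, stage="warmup", switch_epoch=10):
        super().__init__()
        # `stage` controls what configure_optimizers returns;
        # `switch_epoch` is a hypothetical epoch at which to switch stages.
        self.stage = stage
        self.switch_epoch = switch_epoch
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        # Branch on the stage argument so that re-running this hook
        # yields the optimizer for the current training phase.
        if self.stage == "warmup":
            return torch.optim.SGD(self.parameters(), lr=1e-3)
        return torch.optim.Adam(self.parameters(), lr=1e-4)

    def on_train_epoch_start(self):
        # Switch stages mid-training and force Lightning to call
        # configure_optimizers again, converting the new optimizer
        # into the trainer's internal state (correct types etc.).
        if self.current_epoch == self.switch_epoch and self.stage == "warmup":
            self.stage = "finetune"
            self.trainer.accelerator.setup_optimizers(self.trainer)
```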

Answer selected by Godofnothing