
Support for multiple optimizers #700

Open
linshokaku opened this issue May 30, 2023 · 1 comment
Labels
cat:enhancement New feature or request prio:low

Comments

linshokaku (Member) commented May 30, 2023

I want to be able to apply a different optimizer to each set of weights in a single model.

example (imports added for completeness):

import torch
import torch.nn as nn
import torch.nn.functional as F
import pytorch_pfn_extras as ppe

class MyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.l1 = nn.Linear(20, 15)
        self.l2 = nn.Linear(15, 10)

        with torch.no_grad():
            self.l1.weight.copy_(torch.ones((15, 20)))
            self.l1.bias.copy_(torch.ones((15,)))
            self.l2.weight.copy_(torch.ones((10, 15)))
            self.l2.bias.copy_(torch.ones((10,)))

    def forward(self, x):
        y = F.relu(self.l1(x))
        y = self.l2(y)
        return y

model = MyModel()
# One optimizer per submodule: Adam for l1, SGD for l2.
adam_optimizer = torch.optim.Adam(model.l1.parameters(), lr=0.1)
sgd_optimizer = torch.optim.SGD(model.l2.parameters(), lr=0.1)

trainer = ppe.engine.create_trainer(
    model_with_loss, optimizer???, epochs,  # what should be passed here?
    device=device,
)
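Until this is supported, one possible workaround is to hide several optimizers behind the single-optimizer interface that `create_trainer` expects. The `MultiOptimizer` class below is a hypothetical sketch, not part of pytorch-pfn-extras; it only delegates the common `torch.optim.Optimizer` methods to each wrapped optimizer.

```python
class MultiOptimizer:
    """Hypothetical wrapper: presents several optimizers as one.

    Delegates zero_grad/step/state_dict to every wrapped optimizer so the
    combined object can be passed where a single optimizer is expected.
    """

    def __init__(self, *optimizers):
        self.optimizers = list(optimizers)

    def zero_grad(self):
        for opt in self.optimizers:
            opt.zero_grad()

    def step(self):
        for opt in self.optimizers:
            opt.step()

    def state_dict(self):
        # Serialize each underlying optimizer's state in order.
        return [opt.state_dict() for opt in self.optimizers]

    def load_state_dict(self, states):
        for opt, state in zip(self.optimizers, states):
            opt.load_state_dict(state)
```

With this, `MultiOptimizer(adam_optimizer, sgd_optimizer)` could be passed in place of a single optimizer, though learning-rate schedulers and extensions that expect a real `torch.optim.Optimizer` may still need per-optimizer handling.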
@linshokaku linshokaku added the cat:enhancement New feature or request label May 30, 2023
linshokaku (Member, Author) commented May 30, 2023

The current CodeBlockLogic can already handle multiple optimizers, but it has lost the ability to select the target optimizer by model_name, which makes it incompatible with Logic:

list(optimizers.values()),

In Logic, only one optimizer can be selected via model_name, so I think this needs to be modified:

optimizers[self.model_name].zero_grad()
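To illustrate the direction this comment suggests, here is a hedged sketch of a train step that iterates over every optimizer in the dict instead of indexing a single one by `self.model_name`. The function names are illustrative only, not the actual ppe internals.

```python
def zero_grad_all(optimizers):
    """Zero gradients for every registered optimizer.

    Illustrative replacement for `optimizers[self.model_name].zero_grad()`:
    instead of picking one entry by name, touch all entries in the dict.
    """
    for opt in optimizers.values():
        opt.zero_grad()

def step_all(optimizers):
    """Step every registered optimizer after the backward pass."""
    for opt in optimizers.values():
        opt.step()
```

This keeps dict-of-optimizers support (as in CodeBlockLogic's `list(optimizers.values())`) while removing the single-name assumption; whether the by-name form should remain available as an opt-in is a separate design question.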
