
Feature request for the AdEMAMix optimizer. #1058

Open
mathDR opened this issue Sep 13, 2024 · 4 comments · May be fixed by #1104

Comments


mathDR commented Sep 13, 2024

The AdEMAMix optimizer is a simple modification of the Adam optimizer that mixes two EMAs of the gradient to better take advantage of past gradients.

The paper has optax skeleton code which I could contribute if the maintainers deem this a good fit for the repo.
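For concreteness, here is a minimal pure-Python sketch of the two-EMA update rule described above, for a single scalar parameter. This is an illustrative sketch only, not the paper's optax skeleton: the function name, the hyperparameter defaults (`b3`, `alpha`), and the omission of the paper's alpha/beta3 warmup schedulers are assumptions for the example.

```python
import math

def ademamix_update(g, state, lr=1e-3, b1=0.9, b2=0.999, b3=0.9999,
                    alpha=5.0, eps=1e-8, step=1):
    """One AdEMAMix-style step for a scalar parameter (sketch).

    Adam keeps one EMA of the gradient (m1); AdEMAMix adds a second,
    much slower EMA (m2, decay b3) and mixes it into the numerator
    with coefficient alpha.
    """
    m1, m2, v = state
    m1 = b1 * m1 + (1 - b1) * g       # fast gradient EMA (as in Adam)
    m2 = b3 * m2 + (1 - b3) * g       # slow gradient EMA (the AdEMAMix addition)
    v = b2 * v + (1 - b2) * g * g     # second-moment EMA (as in Adam)
    m1_hat = m1 / (1 - b1 ** step)    # bias-correct the fast EMA only
    v_hat = v / (1 - b2 ** step)
    update = lr * (m1_hat + alpha * m2) / (math.sqrt(v_hat) + eps)
    return update, (m1, m2, v)
```

With `alpha = 0` this reduces to the standard Adam step, which is why the paper can describe it as a simple modification.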

@clementpoiret

Hey, any update on this since the PR was closed? :) Thanks!


mathDR commented Oct 13, 2024

Apologies. I closed this because it was just easier to "begin again". I have a draft PR locally that I was going to push this week.

@clementpoiret

Awesome, good to know! Thx for your work 👌

@mathDR mathDR linked a pull request Oct 14, 2024 that will close this issue

mathDR commented Oct 14, 2024

Okay PR here
