This repository has been archived by the owner on Nov 16, 2023. It is now read-only.
[FEATURE] save optimizer and amp state into checkpoint #562
Labels
enhancement
New feature or request
Description
Currently, in common.py for the transformer models, a checkpoint saves only the model state; the optimizer and amp state are not saved. We could consider saving this info as described in the Apex checkpointing docs:
https://github.com/NVIDIA/apex#checkpointing
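A minimal sketch of the pattern from the Apex README: bundle the model, optimizer, and amp state dicts into one checkpoint and restore each on load. To keep the sketch self-contained it uses stdlib stand-ins and `pickle`; real code would call `torch.save`/`torch.load` on the actual model, optimizer, and `apex.amp` objects.

```python
import io
import pickle

# Stand-in exposing the state_dict()/load_state_dict() protocol that
# torch modules, optimizers, and apex.amp all follow.
class Stateful:
    def __init__(self, **state):
        self.state = dict(state)

    def state_dict(self):
        return dict(self.state)

    def load_state_dict(self, sd):
        self.state = dict(sd)

model = Stateful(weight=1.0)
optimizer = Stateful(lr=0.1, step=42)
amp = Stateful(loss_scale=1024.0)

# Save all three states into one checkpoint dict
# (torch.save would replace pickle.dump here).
checkpoint = {
    "model": model.state_dict(),
    "optimizer": optimizer.state_dict(),
    "amp": amp.state_dict(),
}
buf = io.BytesIO()
pickle.dump(checkpoint, buf)

# Restore into fresh objects, mirroring torch.load + load_state_dict.
buf.seek(0)
loaded = pickle.load(buf)
model2, optimizer2, amp2 = Stateful(), Stateful(), Stateful()
model2.load_state_dict(loaded["model"])
optimizer2.load_state_dict(loaded["optimizer"])
amp2.load_state_dict(loaded["amp"])
```

With this layout, resuming training recovers the optimizer's momentum/step counters and amp's dynamic loss scale instead of reinitializing them.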