
Args for reproducing the fine-tuned model #3

Open
forest1988 opened this issue Jan 24, 2021 · 0 comments

Comments

@forest1988

Hello,

In README.md, you write that training runs for 40000 steps at most.
However, when I tried to reproduce the fine-tuned model you kindly provide, the default settings produce far more training steps than that (run_union.py defaults to 100.0 epochs).

The initial checkpoint of BERT can be downloaded from bert. We use the uncased base version of BERT (about 110M parameters). We train the model for 40000 steps at most. The training process will take about 1~2 days.

After the 100-epoch training, I got model.ckpt-1414000.

If you don't mind, could you please tell me the appropriate args for reproducing the fine-tuned UNION models used in the paper?
Is it enough to change the number of training epochs?
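For what it's worth, here is a back-of-the-envelope estimate based on the numbers above. It assumes the checkpoint number (model.ckpt-1414000) equals the total optimizer steps and that steps scale linearly with epochs — both assumptions, not confirmed by the repo:

```python
# Rough estimate of how many epochs correspond to 40000 steps.
# Assumption: the checkpoint suffix is the total step count, and
# steps scale linearly with the epoch setting.
total_steps = 1_414_000   # from model.ckpt-1414000 after the default run
epochs_run = 100.0        # run_union.py default

steps_per_epoch = total_steps / epochs_run          # 14140 steps/epoch
target_steps = 40_000                               # "40000 steps at most" per README.md
epochs_needed = target_steps / steps_per_epoch      # ~2.83 epochs

print(f"{steps_per_epoch:.0f} steps/epoch -> ~{epochs_needed:.2f} epochs for {target_steps} steps")
```

If that assumption holds, roughly 3 epochs (rather than 100) would land near the 40000-step budget, but please correct me if the step counting works differently.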

I'm sorry if there are already details somewhere.

Thank you in advance.
