
Could you add some comments in the code? #1

Open
PengboLiu opened this issue Feb 20, 2020 · 1 comment

@PengboLiu

It's hard to understand the code, including the bash shell scripts.

@zhuchen03 (Owner) commented Feb 20, 2020

Hi Pengbo,

Sorry for not writing enough comments. I just added comments on the hyperparameters used in fairseq-RoBERTa/launch/FreeLB/mnli-fp32-clip.sh and huggingface-transformers/launch/run_glue.sh, so you can start reading the code from these scripts.

fairseq is more convoluted, so I think it is much easier to start with Huggingface's transformers. The algorithm is entirely contained in huggingface-transformers/examples/run_glue_freelb.py, plus some modifications for the dropout mask in the ALBERT model. fairseq also includes our implementations of FreeAT and YOPO, but it will take more time to read.
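For orientation before diving into run_glue_freelb.py, here is a minimal NumPy sketch of the FreeLB-style inner loop on a toy linear model. This is not the repository's PyTorch implementation; the function name `freelb_step` and all hyperparameter values are illustrative. The key idea it shows: run K ascent steps on an input perturbation delta (projected to an eps-ball) while accumulating the averaged parameter gradient, then take one descent step on the parameters.

```python
import numpy as np

def freelb_step(w, x, y, eps=0.1, alpha=0.03, lr=0.05, K=3, rng=None):
    """One FreeLB-style update for the toy loss L = 0.5 * (w.(x + delta) - y)^2.

    delta plays the role of the perturbation on word embeddings; w plays
    the role of the model parameters. All names here are illustrative.
    """
    if rng is None:
        rng = np.random.default_rng(0)

    # Random initial perturbation, scaled onto the eps-ball.
    delta = rng.uniform(-1.0, 1.0, size=x.shape)
    delta *= eps / max(np.linalg.norm(delta), 1e-12)

    grad_w_acc = np.zeros_like(w)
    for _ in range(K):
        x_adv = x + delta
        err = w @ x_adv - y

        # Accumulate the parameter gradient, averaged over the K ascent steps.
        grad_w_acc += (err * x_adv) / K

        # Ascend on the perturbation: gradient of the loss w.r.t. the input,
        # normalized, as in PGD-style updates.
        grad_delta = err * w
        delta = delta + alpha * grad_delta / max(np.linalg.norm(grad_delta), 1e-12)

        # Project delta back onto the eps-ball.
        norm = np.linalg.norm(delta)
        if norm > eps:
            delta *= eps / norm

    # One descent step on the averaged adversarial gradient.
    return w - lr * grad_w_acc
```

Calling `freelb_step` repeatedly on a batch fits the parameters while the inner loop keeps attacking the inputs, which is the "free" adversarial-training pattern the script implements at full scale with PyTorch autograd.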

I will add more comments to the code soon!
