Could you provide the hyperparameter configuration file? Also, the bundled bert_config_tiny.json does not appear to be the hparam file expected by run_finetuning.py.
bert_config_tiny.json is the configuration file for the discriminator (to allow comparison against a roberta-tiny of the same scale, the generator is 1/4 the size of the discriminator). For finetuning, use the PyCLUE package directly (it uses the official BERT source code, with scope='electra') and add the layer-wise learning rate decay from the official code to the corresponding optimizer.
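The layer-wise learning rate decay mentioned above can be sketched as follows. This is a minimal, hedged illustration, not the official implementation: it assumes the common scheme where a parameter in layer k receives the base learning rate scaled by decay^(n_layers - k), so lower layers update more slowly than the task head. The function name, the layer count, and the 0.8 decay factor are all illustrative assumptions.

```python
def layerwise_lr(base_lr, n_layers, layer_decay=0.8):
    """Map layer index -> learning rate under layer-wise decay.

    Layer 0 is the embedding layer; layer n_layers is the topmost
    encoder layer / task head, which keeps the full base rate.
    """
    return {k: base_lr * (layer_decay ** (n_layers - k))
            for k in range(n_layers + 1)}

# Illustrative 4-layer setup: the top layer gets base_lr, each layer
# below it gets 0.8x the rate of the layer above.
lrs = layerwise_lr(base_lr=1e-4, n_layers=4)
```

In practice each parameter group would be handed to the optimizer with its own rate (or the per-layer multiplier folded into the gradient update), matching how the official ELECTRA code wires the decay into its AdamW optimizer.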
— yyht