Does this project fine-tune all network layers, or only a subset of them?
After training with the dataset and training script provided by the author, I found that the model's general chat ability degraded. Is there a good way to avoid this? In other words, how can I add new domain-specific capability while preserving ChatGLM's original abilities?
Using LoRA should work better.
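For reference, here is a minimal sketch (not this repo's actual training script) of how LoRA could be attached to ChatGLM with the Hugging Face PEFT library. Because LoRA freezes the original weights and trains only small low-rank adapters, it tends to cause less catastrophic forgetting than full-parameter fine-tuning. The model name, rank, and target_modules below are assumptions, not values taken from this project.

```python
# Hypothetical LoRA setup for ChatGLM using PEFT; hyperparameters are illustrative.
from transformers import AutoModel, AutoTokenizer
from peft import LoraConfig, get_peft_model, TaskType

model_name = "THUDM/chatglm-6b"  # assumed base model
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModel.from_pretrained(model_name, trust_remote_code=True)

# Only the low-rank adapter matrices are trained; the base weights stay frozen,
# which helps preserve the model's general chat ability.
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                 # adapter rank (assumed)
    lora_alpha=32,
    lora_dropout=0.1,
    target_modules=["query_key_value"],  # ChatGLM's fused attention projection (assumed)
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only a small fraction of parameters are trainable
```

You can then train the wrapped model with your usual training loop or Trainer; at inference time the adapters can be merged or loaded on top of the unchanged base model.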