Does a LoRA-fine-tuned model need to be merged? #885
Comments
Hello, yes, the weights need to be merged. You can refer to the merging-lora-weights section of the tutorial.
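To see why merging and unmerged inference should give the same answer, here is a minimal NumPy sketch of the standard LoRA formulation, W' = W + (α/r)·BA. The shapes and values are hypothetical illustrations, not taken from InternVL; this is a sketch of the general technique, not the repo's actual merge code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions chosen for illustration only.
d_out, d_in, r, alpha = 8, 8, 2, 4

W = rng.standard_normal((d_out, d_in))  # frozen base weight
A = rng.standard_normal((r, d_in))      # LoRA down-projection
B = rng.standard_normal((d_out, r))     # LoRA up-projection
x = rng.standard_normal(d_in)           # a sample input

# Unmerged inference: base path plus the scaled low-rank adapter path.
y_adapter = W @ x + (alpha / r) * (B @ (A @ x))

# Merged inference: fold the adapter into the base weight once,
# then run a plain forward pass with no extra adapter computation.
W_merged = W + (alpha / r) * (B @ A)
y_merged = W_merged @ x

# The two outputs are identical up to floating-point error.
assert np.allclose(y_adapter, y_merged)
```

So merging does not change the model's function; it just bakes the adapter into the base weights so inference needs no LoRA-aware code path.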
One more question: is the merge actually mandatory? It looks like merging only fuses the LoRA weights into the base model for more convenient inference, so without merging, inference should in principle still produce correct results, right?
Hello, thank you for the reply. I tried merging with the method from the tutorial, but the results are still poor. Specifically, I added a classification head on top of the original model to do classification, with 60k+ training samples. Near the end of fine-tuning, the model's predicted class probabilities versus the ground-truth labels look like:
My fine-tuned model does not seem to have learned anything. Do I need to merge its weights with the original model's? The fine-tuning script I used is: internvl_chat/shell/internvl2.5/2nd_finetune/internvl2_5_2b_dynamic_res_2nd_finetune_lora.sh