
Finetuning #51

Open
asunaperisi opened this issue Dec 21, 2023 · 0 comments
@asunaperisi

Great work, and thank you for sharing it. I'd like to ask how I can fine-tune your model with my own dataset. I saw you already added vicap.

I am using this code for fine-tuning, along with the same vocab file you provide. Naturally, some tokens in my dataset are not included in that vocab file. At first I thought I could simply add the missing tokens to the vocab file; since there is also a vocab_size parameter, I updated it accordingly. But when I try to load your pre-trained model, I get a size mismatch error. Is there any way to fine-tune without retraining the entire model?
Thank you.
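
A common workaround for this kind of size mismatch (a sketch of the general technique, not this repository's documented procedure) is to resize the checkpoint's token embedding and output projection by hand: keep all the pretrained rows and randomly initialize only the rows for the newly added tokens, then build the model with the enlarged vocab_size and load the patched weights. In the sketch below, the checkpoint path and the key names `word_emb.weight`, `fc.weight`, and `fc.bias` are assumptions; substitute whatever the model actually calls its embedding and output layers.

```python
import torch

NEW_VOCAB_SIZE = 10500  # old vocab size + number of added tokens (assumption)

# Load the pretrained checkpoint; some checkpoints nest weights under "state_dict".
ckpt = torch.load("pretrained.pth", map_location="cpu")
state = ckpt["state_dict"] if isinstance(ckpt, dict) and "state_dict" in ckpt else ckpt

def grow_rows(weight: torch.Tensor, new_rows: int) -> torch.Tensor:
    """Return a (new_rows, dim) matrix: pretrained rows copied over,
    rows for the new tokens randomly initialized."""
    grown = torch.empty(new_rows, weight.size(1), dtype=weight.dtype)
    torch.nn.init.normal_(grown, std=0.02)
    grown[: weight.size(0)] = weight
    return grown

# Hypothetical key names -- replace with the actual embedding / output-projection keys.
for key in ("word_emb.weight", "fc.weight"):
    state[key] = grow_rows(state[key], NEW_VOCAB_SIZE)

# If the output layer has a bias, grow it the same way (1-D case).
if "fc.bias" in state:
    bias = state["fc.bias"]
    grown_bias = torch.zeros(NEW_VOCAB_SIZE, dtype=bias.dtype)
    grown_bias[: bias.size(0)] = bias
    state["fc.bias"] = grown_bias

torch.save(ckpt, "pretrained_resized.pth")
```

After patching, construct the model with vocab_size=NEW_VOCAB_SIZE and load pretrained_resized.pth with load_state_dict; only the rows for the new tokens start untrained, so fine-tuning can proceed without retraining the whole model.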
