
predict simplekt with custom seq_len #191

Open
wants to merge 3 commits into base: main

Conversation

kheedogg
model_name: simplekt, emb_type: qid
  File "/workspace/pykt-toolkit/examples/wandb_predict.py", line 145, in <module>
    main(params)
  File "/workspace/pykt-toolkit/examples/wandb_predict.py", line 67, in main
    model = load_model(model_name, model_config, data_config, emb_type, save_dir)
  File "/usr/local/lib/python3.10/dist-packages/pykt_toolkit-0.0.38-py3.10.egg/pykt/models/init_model.py", line 125, in load_model
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 2153, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for simpleKT:
        size mismatch for model.position_emb.weight: copying a param with shape torch.Size([1, 512, 256]) from checkpoint, the shape in current model is torch.Size([1, 200, 256]).

When I try to run examples/wandb_predict.py for simplekt, I get the error message shown above.
I trained the model with a max_seq_len different from the one the prediction script uses, and this PR contains the fix.
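
The size mismatch happens because wandb_predict.py rebuilds simpleKT with the default seq_len (200 here), while the checkpoint's position embedding was created with the training-time seq_len (512), so load_state_dict cannot copy model.position_emb.weight. A minimal sketch of the kind of workaround involved, assuming the training run saved its configuration as config.json inside save_dir with the sequence length under a seq_len key (both file name and key are assumptions, and the actual commits in this PR may do it differently):

    import json
    import os

    def override_seq_len(model_config, save_dir):
        """Hypothetical helper: copy the trained seq_len into the
        prediction-time model_config so the rebuilt position embedding
        matches the checkpoint's shape."""
        config_path = os.path.join(save_dir, "config.json")  # assumed file name
        with open(config_path) as f:
            trained_config = json.load(f)
        # assumed layout: trained model_config carries a "seq_len" entry
        trained_seq_len = trained_config.get("model_config", {}).get("seq_len")
        if trained_seq_len is not None:
            model_config["seq_len"] = trained_seq_len
        return model_config

    # Usage in wandb_predict.py, before the call that currently fails:
    # model_config = override_seq_len(model_config, save_dir)
    # model = load_model(model_name, model_config, data_config, emb_type, save_dir)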
