test-caffe-r101-fix36.yaml #99
Comments
Some model parameters or buffers are not found in the checkpoint:
Hello, I ran into this problem too ("Some model parameters or buffers are not found in the checkpoint:"). Have you solved it?
No. I checked the code and never found out why; perhaps only the author knows. So I am planning to go back to the original bottom-up-attention (Caffe) repo, which works very well.
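In case it helps anyone else who hits the same warning, here is a small diagnostic sketch (not code from this repo) that compares the parameter names of the model you built against the names stored in the .pth file. A mismatch usually means the config builds a slightly different architecture than the one the checkpoint was trained with. The helper name and the assumption that the weights sit under a "model" key (Detectron2-style) are my own; adjust them to your checkpoint.

```python
import torch

def diff_checkpoint_keys(model, ckpt_path):
    """Print parameter/buffer names that differ between a model and a checkpoint.

    Hypothetical helper for debugging the warning above. `model` is whatever
    nn.Module your config builds; `ckpt_path` is the .pth passed to the tester.
    """
    checkpoint = torch.load(ckpt_path, map_location="cpu")
    # Detectron2-style checkpoints usually nest the weights under a "model" key.
    if isinstance(checkpoint, dict) and "model" in checkpoint:
        state_dict = checkpoint["model"]
    else:
        state_dict = checkpoint

    model_keys = set(model.state_dict().keys())
    ckpt_keys = set(state_dict.keys())

    print("In the model but missing from the checkpoint (the names in the warning):")
    for k in sorted(model_keys - ckpt_keys):
        print("  ", k)

    print("In the checkpoint but not used by the model:")
    for k in sorted(ckpt_keys - model_keys):
        print("  ", k)
```

If the missing names all belong to one sub-module (for example an attribute head), the config and the checkpoint likely disagree about whether that head exists.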
Could you help me solve a problem I hit when running generate_tsv.py in the original bottom-up-attention?
Can you give me your contact information, if it is convenient? If not, thank you all the same. QQ or WeChat?
WeChat: 17320069772
I use test-caffe-r101-fix36.yaml, but the model generates 16 or 26 bboxes rather than a fixed 36. Why?
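For reference, in the original Caffe generate_tsv.py the fixed-box behaviour comes from post-processing rather than from the detector itself: boxes above a confidence threshold are taken first, and the result is then clamped to the [MIN_BOXES, MAX_BOXES] range by sorting on the best per-box class score, so with MIN_BOXES = MAX_BOXES = 36 you always get exactly 36. If the PyTorch port returns fewer, it is worth checking that the config you load really sets both the minimum and maximum box counts to 36. Below is a rough sketch of that clamping logic, assuming `cls_scores` is an (N, num_classes) NumPy array of per-box class scores; the threshold value is the one I recall from the original script, so treat it as an assumption.

```python
import numpy as np

MIN_BOXES = 36
MAX_BOXES = 36
CONF_THRESH = 0.2  # assumed value, as in the original generate_tsv.py

def select_fixed_boxes(cls_scores, conf_thresh=CONF_THRESH):
    """Return indices of boxes to keep, clamped to [MIN_BOXES, MAX_BOXES]."""
    max_conf = cls_scores.max(axis=1)            # best class score per box
    keep = np.where(max_conf >= conf_thresh)[0]  # confident boxes first
    if len(keep) < MIN_BOXES:
        # Too few confident boxes: back-fill with the highest-scoring ones.
        keep = np.argsort(max_conf)[::-1][:MIN_BOXES]
    elif len(keep) > MAX_BOXES:
        # Too many: keep only the top MAX_BOXES by score.
        keep = np.argsort(max_conf)[::-1][:MAX_BOXES]
    return keep
```

Getting 16 or 26 boxes suggests the back-fill step (the `len(keep) < MIN_BOXES` branch) is not being applied in your extraction path, or the config keys controlling the minimum box count are not the ones the fix36 yaml sets.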