
Check failed: target_blobs.size() == source_layer.blobs_size() (5 vs. 3) Incompatible number of blobs for layer data_bn #27

Open
monocongo opened this issue Sep 3, 2019 · 0 comments

When I load the ResNet10 model and the associated weights (from here) into DIGITS for training on a custom image dataset, I get the following output:

ERROR: Check failed: target_blobs.size() == source_layer.blobs_size() (5 vs. 3) Incompatible number of blobs for layer data_bn

layer_64_1_relu2 does not need backward computation.
layer_64_1_scale2 does not need backward computation.
layer_64_1_bn2 does not need backward computation.
layer_64_1_conv1 does not need backward computation.
conv1_pool_conv1_pool_0_split does not need backward computation.
conv1_pool does not need backward computation.
conv1_relu does not need backward computation.
conv1_scale does not need backward computation.
conv1_bn does not need backward computation.
conv1 does not need backward computation.
data_scale does not need backward computation.
data_bn does not need backward computation.
label does not need backward computation.
data does not need backward computation.
This network produces output label
This network produces output prob
Network initialization done.
Solver scaffolding done.
Finetuning from /resnet10_cvgj/resnet10/resnet10_cvgj_iter_320000.caffemodel
Check failed: target_blobs.size() == source_layer.blobs_size() (5 vs. 3) Incompatible number of blobs for layer data_bn

I have seen similar issues where comments describe this sort of error as indicating an incompatibility between the model architecture (deploy.prototxt) and the pre-trained weights (the .caffemodel file).
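
In case it helps narrow things down, below is a rough diagnostic sketch (not a fix) that compares the per-layer blob counts stored in the .caffemodel against the parameter blobs the prototxt-defined network expects. The file names are placeholders, and the "expected" counts depend on which Caffe fork pycaffe is built against; as far as I can tell, DIGITS uses NVIDIA's Caffe, whose BatchNorm layer stores a different number of blobs than BVLC Caffe's, which may be where the 5 vs. 3 comes from.

# Rough diagnostic sketch (assumes pycaffe is importable; file names are placeholders).
# Compares the blob counts saved per layer in the .caffemodel with the parameter
# blobs the prototxt-defined net expects, to locate mismatches like data_bn (5 vs. 3).
import caffe
from caffe.proto import caffe_pb2

PROTOTXT = "deploy.prototxt"                          # placeholder path
CAFFEMODEL = "resnet10_cvgj_iter_320000.caffemodel"   # placeholder path

# Blob counts stored in the weights file (new-style "layer" field).
weights = caffe_pb2.NetParameter()
with open(CAFFEMODEL, "rb") as f:
    weights.ParseFromString(f.read())
saved = {l.name: len(l.blobs) for l in weights.layer if len(l.blobs) > 0}

# Blob counts the network definition expects; weights are not loaded here,
# so the "Incompatible number of blobs" check is not triggered.
net = caffe.Net(PROTOTXT, caffe.TEST)
expected = {name: len(blobs) for name, blobs in net.params.items()}

for name in sorted(set(saved) | set(expected)):
    s, e = saved.get(name, 0), expected.get(name, 0)
    if s != e:
        print("MISMATCH %-25s saved=%d expected=%d" % (name, s, e))

For data_bn this should show whether the weights carry 3 blobs (BVLC-style BatchNorm) while the net built by DIGITS expects 5, or vice versa.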

Can anyone suggest how to resolve or work around this issue? Thanks in advance for any suggestions or insight.
