
bcnn_finetuning.py run error #9

Open
Hedlen opened this issue Aug 12, 2017 · 7 comments
@Hedlen commented Aug 12, 2017

When I run bcnn_finetuning.py, the following error occurs. My dataset has 134 classes.

InvalidArgumentError (see above for traceback): logits and labels must be same size: logits_size=[4,134] labels_size=[16,134] [[Node: SoftmaxCrossEntropyWithLogits = SoftmaxCrossEntropyWithLogits[T=DT_FLOAT, _device="/job:localhost/replica:0/task:0/gpu:0"](Reshape_2, Reshape_3)]]

logits_size and labels_size are not equal. I don't know why this happens; I am running the code exactly as you describe.
Please help me solve this problem.
Thank you very much!

@Hedlen (Author) commented Aug 14, 2017

I have figured it out; it was my mistake. Thank you very much.

@rnsandeep

Same issue for me. I changed my images to 224 instead of 448, and resolved it by changing this line in bcnn_finetuning.py (line 218):

    self.conv5_3 = tf.reshape(self.conv5_3, [-1, 512, 784])  # reshape conv5_3 to [batch_size, number_of_filters, height*width]

to

    self.conv5_3 = tf.reshape(self.conv5_3, [-1, 512, 196])  # 784 depends on the image size; for 224, 196 is the correct number

I think it would be better to get this value from the tensors instead of hard-coding 784.
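
A minimal sketch of that idea (not the repo's exact code; it assumes conv5_3 is the usual VGG-16 NHWC feature map, e.g. [batch, 14, 14, 512] for 224x224 inputs, before the channel-first transpose):

    # Read the spatial size off conv5_3's static shape instead of hard-coding 784.
    h, w = self.conv5_3.get_shape().as_list()[1:3]          # 14, 14 for 224x224 inputs
    spatial = h * w                                         # 196 for 224, 784 for 448

    conv = tf.transpose(self.conv5_3, perm=[0, 3, 1, 2])    # [batch, 512, h, w]
    self.conv5_3 = tf.reshape(conv, [-1, 512, spatial])     # [batch, 512, h*w]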

@Hedlen (Author) commented Aug 30, 2017

@rnsandeep Thank you for your reply. I will follow your advice. Thanks again!

@YanShuo1992

@rnsandeep @Hedlen Hi, I ran into the same issue and your answers solved it. However, I still have two questions:

  1. Do I also need to change line 232: self.phi_I = tf.divide(self.phi_I, 784.0)?
  2. How do I calculate the correct value if I use images of another size?

Cheers

@rnsandeep commented Nov 17, 2017

@YanShuo1992 I can only speak for multiples of 224: if you make your image 112 (224/2) it should be 49 (196/4); if you make it 448 (224x2) it should be 196x4 = 784. Not sure about every size.
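
The pattern generalizes if the repo uses the stock VGG-16 layout, where conv5_3 sits after four 2x2 max-pools, so each spatial side is input_size/16 and the flattened size is its square. A small sketch under that assumption:

    # Why 112 -> 49, 224 -> 196, 448 -> 784 (assumes stock VGG-16: four 2x2 max-pools before conv5_3).
    def conv5_3_flat_size(input_size):
        side = input_size // 16                    # each pool halves the spatial side
        return side * side                         # height * width of conv5_3

    for size in (112, 224, 448):
        print(size, conv5_3_flat_size(size))       # 112 49, 224 196, 448 784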

@YanShuo1992

@rnsandeep my image is 720x720. I found you can print the shape of conv5_3: it is (?, 512, 14, 14) when the image size is 224. You can simply store the spatial size of conv5_3 as s and replace the 196 with s*s.
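
Putting both fixes together, a standalone sketch of the dynamic version (TF 1.x). The bilinear-pooling lines follow the usual B-CNN formulation rather than being copied from the repo, and it assumes line 232's 784.0 is the same h*w normalizer:

    import tensorflow as tf

    # Assumed layout: conv5_3 after the transpose, i.e. [batch, 512, h, w];
    # the thread reports (?, 512, 14, 14) for 224x224 inputs.
    conv5_3 = tf.placeholder(tf.float32, [None, 512, 14, 14])

    s = conv5_3.get_shape().as_list()
    hw = s[2] * s[3]                                            # 14 * 14 = 196, the "s*s" above

    conv_flat = tf.reshape(conv5_3, [-1, 512, hw])              # replaces the hard-coded 784 (line 218)
    phi_I = tf.matmul(conv_flat, conv_flat, transpose_b=True)   # bilinear pooling, [batch, 512, 512]
    phi_I = tf.reshape(phi_I, [-1, 512 * 512])
    phi_I = tf.divide(phi_I, float(hw))                         # replaces 784.0 (line 232), if that is indeed h*w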

@jiayugedede

In bcnn_finetuning.py, the line for i in range(total_val_batch): raises an error: the name total_val_batch is undefined.
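
For anyone hitting the same NameError, a guess at the intended definition (every name besides total_val_batch below is a placeholder, not from the repo): it is presumably the number of validation batches, defined before the loop, e.g.:

    # Hypothetical sketch; batch_size and val_data stand in for whatever the script actually loads.
    batch_size = 16
    val_data = list(range(1340))                                 # stand-in for the loaded validation set
    total_val_batch = len(val_data) // batch_size                # number of validation batches

    for i in range(total_val_batch):
        batch = val_data[i * batch_size:(i + 1) * batch_size]    # slice out batch i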
