
BatchNormalization.lua:87: got 32-feature tensor, expected 0 #1327

Closed

Description

@emredog

Hello,

I have a model that uses nn.SpatialBatchNormalization, and I get the error in the title.

I want to run inference on a single image (3x224x224). I've tried both the "unsqueezed" version (1x3x224x224) and the original; both lead to the same error.
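
For reference, this is roughly how I'm calling it (a minimal sketch; the file name model.t7 and the random tensor are placeholders for my actual model and image):

require 'nn'

-- Load the trained model and switch to evaluation mode, so batch norm
-- should use its running statistics rather than per-batch statistics.
local model = torch.load('model.t7')   -- placeholder path
model:evaluate()

-- Stand-in for the actual 3x224x224 input image.
local img = torch.rand(3, 224, 224)

-- "Unsqueezed" variant: add a batch dimension of size 1 (1x3x224x224).
local batched = img:view(1, 3, 224, 224)

-- Either call fails with the same assertion in BatchNormalization.lua:
local out = model:forward(batched)     -- or model:forward(img)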

The relevant part of the network is as follows:

  (1): nn.SpatialConvolution(3 -> 32, 3x3, 1,1, 1,1) without bias
  (2): nn.SpatialBatchNormalization (4D) (0)
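
For comparison, constructing this fragment from scratch (just a sketch, not my actual model definition) should print the feature count rather than 0 for the batch-norm layer:

require 'nn'

local net = nn.Sequential()
-- 3 -> 32, 3x3 kernels, stride 1, padding 1, no bias (matches the printout above)
net:add(nn.SpatialConvolution(3, 32, 3, 3, 1, 1, 1, 1):noBias())
-- 32 features, so running_mean should have 32 elements
net:add(nn.SpatialBatchNormalization(32))

print(net)
-- ...
-- (2): nn.SpatialBatchNormalization (4D) (32)

whereas my loaded model prints (4D) (0) for the same layer.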

I've added the following prints to the checkInputDim function that throws the error:

-- nn/BatchNormalization.lua: checkInputDim, with debug prints added
function BN:checkInputDim(input)
   local iDim = input:dim()
   assert(iDim == self.nDim or
              (iDim == self.nDim - 1 and self.train == false), string.format(
      'only mini-batch supported (%dD tensor), got %dD tensor instead',
      self.nDim, iDim))
   local featDim = (iDim == self.nDim - 1) and 1 or 2
   print("train mode?", self.train)
   print("featDim", featDim)
   print("runningMean:nElement", self.running_mean:nElement())
   print("input size:", input:size())
   assert(input:size(featDim) == self.running_mean:nElement(), string.format(
      'got %d-feature tensor, expected %d',
      input:size(featDim), self.running_mean:nElement()))
end

This yields the following output for the unsqueezed image:

train mode?	false	
featDim	2	
runningMean:nElement	0	
input size:	   1
  32
 224
 224
[torch.LongStorage of size 4]
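
So running_mean on the loaded layer appears to be empty. A quick check I can run on the loaded model (assuming the batch-norm layer is reachable as model:get(2); the index is just a guess for my container structure):

-- Inspect the loaded batch-norm layer's running statistics directly.
local bn = model:get(2)
print(torch.type(bn))                  -- nn.SpatialBatchNormalization
print(bn.running_mean:nElement())      -- 0 here, but should be 32
-- older nn versions store running_std instead of running_var
print(bn.running_var and bn.running_var:nElement() or 'no running_var field')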

Any help or lead to what to try next is much appreciated, thanks!
