Thanks for your answers. For the first question before, I found that I hadn't downloaded the latest version of the code. But I still have some questions about https://github.com/VinAIResearch/Warping-based_Backdoor_Attack-release/tree/main/defenses/fine_pruning/fine-pruning-celeba.py.

When I apply the fine_pruning method to ResNet50, I find that I have to redefine the last BN layer with nn.BatchNorm2d(pruning_mask.shape[0] - num_pruned) and then load the BN's parameter data in the way of https://github.com/VinAIResearch/Warping-based_Backdoor_Attack-release/blob/main/defenses/fine_pruning/fine-pruning-celeba.py#L150. Otherwise, the output of the redefined last conv layer doesn't match the input dimension of the last BN layer. Finally, before passing net_pruned to the eval function, I call net_pruned.eval() to fix the parameters of the redefined last BN layer. (The ResNet50 I used is torchvision.models.resnet50(), so the concrete layer dimensions may differ, but I think redefining the BN layer is probably also needed in your code.)
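For concreteness, here is a minimal sketch of the adjustment described above, assuming torchvision.models.resnet50() and a hypothetical boolean pruning_mask over the output channels of the last conv layer (layer4[2].conv3 / bn3 in torchvision's naming). This is not the repository's code; in a full implementation the bottleneck's residual connection and the final fc layer would also have to be made consistent with the pruned channel count.

```python
# Sketch only: rebuild the last conv and its BatchNorm after channel pruning,
# then copy the parameters of the surviving channels and freeze BN statistics.
# `pruning_mask`, `net`, and `net_pruned` are assumed names, not from the repo.
import copy
import torch
import torch.nn as nn
import torchvision

net = torchvision.models.resnet50(num_classes=10)
net_pruned = copy.deepcopy(net)

# Assumption: a bool mask over the 2048 output channels of layer4[2].conv3,
# produced by the pruning step (here 512 channels are marked for removal).
pruning_mask = torch.ones(2048, dtype=torch.bool)
pruning_mask[:512] = False
num_pruned = int((~pruning_mask).sum())
kept = torch.where(pruning_mask)[0]

old_conv = net.layer4[2].conv3
old_bn = net.layer4[2].bn3

# Redefine the last conv with fewer output channels, and the BN layer to match.
net_pruned.layer4[2].conv3 = nn.Conv2d(
    old_conv.in_channels, pruning_mask.shape[0] - num_pruned,
    kernel_size=1, stride=1, bias=False,
)
net_pruned.layer4[2].bn3 = nn.BatchNorm2d(pruning_mask.shape[0] - num_pruned)

# Copy the parameters of the kept channels (analogous to fine-pruning-celeba.py#L150).
net_pruned.layer4[2].conv3.weight.data = old_conv.weight.data[kept]
net_pruned.layer4[2].bn3.weight.data = old_bn.weight.data[kept]
net_pruned.layer4[2].bn3.bias.data = old_bn.bias.data[kept]
net_pruned.layer4[2].bn3.running_mean = old_bn.running_mean[kept]
net_pruned.layer4[2].bn3.running_var = old_bn.running_var[kept]

# Fix the redefined BN layer's statistics before evaluation.
net_pruned.eval()
```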