R2KD

Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning

Seonghak Kim, Gyeongdo Ham, Yucheol Cho, Daeshik Kim

This repository provides an implementation of "Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning", published in IEEE Transactions on Knowledge and Data Engineering.

Installation

Environments:

  • Python 3.8
  • PyTorch 1.10.0
  • torchvision 0.11.0
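
A minimal sketch of one way to set up this environment, assuming conda is available (any equivalent virtual environment works; pick the torch build matching your CUDA version):

# create and activate an isolated Python 3.8 environment
conda create -n r2kd python=3.8 -y
conda activate r2kd
# install the PyTorch/torchvision versions listed above
pip install torch==1.10.0 torchvision==0.11.0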

Install the package:

pip install -r requirements.txt
python setup.py develop

Getting started

  1. Evaluation
  • The evaluation commands below are collected in r2kd_eval.sh; replace augmentation_type in each checkpoint path with one of the listed options.

    # augmentation_type = ["mixup", "cutmix", "cutout", "autoaug", "cutmixpick"]
    python tools/eval.py -d tiny_imagenet -m wrn_16_2 -c ./best_results/augmentation_type/wrn402_wrn162_student_best
    python tools/eval.py -d tiny_imagenet -m resnet20 -c ./best_results/augmentation_type/res56_res20_student_best
    python tools/eval.py -d tiny_imagenet -m resnet8x4 -c ./best_results/augmentation_type/res32x4_res8x4_student_best
    python tools/eval.py -d tiny_imagenet -m vgg8 -c ./best_results/augmentation_type/vgg13_vgg8_student_best
    python tools/eval.py -d tiny_imagenet -m MobileNetV2 -c ./best_results/augmentation_type/vgg13_mv2_student_best
    python tools/eval.py -d tiny_imagenet -m ShuffleV2 -c ./best_results/augmentation_type/res32x4_shuv2_student_best
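    # example (a concrete substitution, assuming the augmentation_type placeholder
    # in the checkpoint path is replaced literally): evaluate the WRN-16-2 student
    # distilled from WRN-40-2 with "cutmix"
    python tools/eval.py -d tiny_imagenet -m wrn_16_2 -c ./best_results/cutmix/wrn402_wrn162_student_best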
  2. Training
  • The weights of the teacher models can be downloaded via pretrained_models.sh:

    sh pretrained_models.sh
  • Using these weights, you can train the student models with R2KD, as shown in r2kd_train.sh.

    # augmentation_type = ["mixup", "cutmix", "cutout", "autoaug", "cutmixpick"]
    python tools/train.py --cfg configs/tiny/r2kd/wrn40_2_wrn16_2.yaml --pruning -a augmentation_type
    python tools/train.py --cfg configs/tiny/r2kd/res56_res20.yaml --pruning -a augmentation_type
    python tools/train.py --cfg configs/tiny/r2kd/res32x4_res8x4.yaml --pruning -a augmentation_type
    python tools/train.py --cfg configs/tiny/r2kd/vgg13_vgg8.yaml --pruning -a augmentation_type
    python tools/train.py --cfg configs/tiny/r2kd/vgg13_mv2.yaml --pruning -a augmentation_type
    python tools/train.py --cfg configs/tiny/r2kd/res32x4_shuv2.yaml --pruning -a augmentation_type
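    # example (same substitution convention as above): train the ResNet-20 student
    # from the ResNet-56 teacher with "autoaug" as the augmentation
    python tools/train.py --cfg configs/tiny/r2kd/res56_res20.yaml --pruning -a autoaug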

Citation

Please consider citing R2KD in your publications if it helps your research.

@article{kim2024robustness,
  title={Robustness-reinforced knowledge distillation with correlation distance and network pruning},
  author={Kim, Seonghak and Ham, Gyeongdo and Cho, Yucheol and Kim, Daeshik},
  journal={IEEE Transactions on Knowledge and Data Engineering},
  year={2024},
  publisher={IEEE}
}

Acknowledgement

This code is built on mdistiller and Multi-Level-Logit-Distillation. Thanks to the contributors of both projects for their exceptional efforts.
