Code and pretrained models for paper: Data-Free Adversarial Distillation
Updated Nov 28, 2022 - Python
[IJCAI-2021] Contrastive Model Inversion for Data-Free Knowledge Distillation
Official PyTorch implementation of Data-free Knowledge Distillation for Object Detection, WACV 2021.
[ICML2023] Revisiting Data-Free Knowledge Distillation with Poisoned Teachers
[ICCV 2023] "TRM-UAP: Enhancing the Transferability of Data-Free Universal Adversarial Perturbation via Truncated Ratio Maximization", Yiran Liu, Xin Feng, Yunlong Wang, Wu Yang, Di Ming*