AdaMatch-pytorch

A PyTorch implementation of AdaMatch: A Unified Approach to Semi-Supervised Learning and Domain Adaptation.

This implementation is heavily based on Sayak Paul's excellent Keras blog post. I also used Ang Yi Zhe's PyTorch implementation to fix some problems in my initial implementation.

How to run?

You can run this code simply by doing:

python run.py

Note that this implementation is meant more as a starting point: you'll have to dig a little deeper into the Python code in order to change the hyperparameters, transforms, and the data being loaded.
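For example, a minimal sketch of how the source (MNIST) and target (USPS) data could be loaded with torchvision (illustrative only; the transforms, batch sizes, and loading code in run.py may differ):

```python
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Weak augmentation: small random shifts (illustrative, not necessarily the repo's exact transforms)
weak_transform = transforms.Compose([
    transforms.Resize(32),
    transforms.RandomAffine(degrees=0, translate=(0.1, 0.1)),
    transforms.ToTensor(),
])

source_train = datasets.MNIST(root="data", train=True, download=True, transform=weak_transform)
target_train = datasets.USPS(root="data", train=True, download=True, transform=weak_transform)

source_loader = DataLoader(source_train, batch_size=64, shuffle=True)
# The unlabeled target batch is often larger than the source batch in AdaMatch-style training
target_loader = DataLoader(target_train, batch_size=192, shuffle=True)
```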

Running the code on an MNIST -> USPS adaptation using a ResNet18 architecture gives:

  • Accuracy on source dataset = 0.9822
  • Accuracy on target dataset = 0.9561

Some differences:

  • Network used: due to limited computing resources, my implementation uses a ResNet18 architecture. This can easily be changed by importing other models in the network.py file.
  • The paper uses CTAugment for strong augmentations. I implemented a similar pipeline using the available torchvision transforms (see the sketch after this list).
  • The original implementation uses cosine decay for the learning rate schedule. I tried PyTorch's cosine-related learning rate schedulers but didn't manage to achieve good results, so I used a simple step learning rate scheduler instead (see the example below).
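
A strong-augmentation pipeline in the spirit of CTAugment can be assembled from standard torchvision transforms. This is a rough illustration; the exact operations and magnitudes used in this repository may differ:

```python
from torchvision import transforms

# Illustrative strong augmentations: geometric distortion, photometric jitter,
# and a Cutout-like erasing step applied after conversion to a tensor.
strong_transform = transforms.Compose([
    transforms.Resize(32),
    transforms.RandomAffine(degrees=15, translate=(0.2, 0.2), shear=10),
    transforms.ColorJitter(brightness=0.5, contrast=0.5),
    transforms.ToTensor(),
    transforms.RandomErasing(p=0.5, scale=(0.02, 0.2)),
])
```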

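For reference, the scheduler swap described above might look like this (the step size, decay factor, and epoch count here are placeholders, not necessarily the values used in run.py):

```python
import torch
from torch import nn, optim

model = nn.Linear(10, 2)  # stand-in for the actual network
optimizer = optim.Adam(model.parameters(), lr=1e-3)

# Cosine decay, as in the original AdaMatch implementation:
# scheduler = optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=30)

# Simple step decay used here instead (placeholder step_size/gamma):
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

for epoch in range(30):
    # ... per-batch training with loss.backward() ...
    optimizer.step()
    scheduler.step()  # decay the learning rate at the end of each epoch
```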
If you notice anything that looks out of the ordinary feel free to open an issue. Suggestions are very much appreciated! :)
