This is my TensorFlow implementation of Wasserstein GAN with Gradient Penalty (WGAN-GP), proposed in Improved Training of Wasserstein GANs.

The key insight of WGAN-GP is as follows. To enforce the Lipschitz constraint in Wasserstein GAN, the original paper proposes clipping the weights of the discriminator (critic), which can lead to undesired behavior, including exploding and vanishing gradients. Instead of weight clipping, this paper adds a gradient penalty term that constrains the gradient norm of the critic's output with respect to its input, resulting in the learning objective:

$$L = \mathbb{E}_{\tilde{x} \sim \mathbb{P}_g}\big[D(\tilde{x})\big] - \mathbb{E}_{x \sim \mathbb{P}_r}\big[D(x)\big] + \lambda\, \mathbb{E}_{\hat{x} \sim \mathbb{P}_{\hat{x}}}\Big[\big(\lVert \nabla_{\hat{x}} D(\hat{x}) \rVert_2 - 1\big)^2\Big]$$

where $\hat{x}$ is sampled uniformly along straight lines between pairs of real and generated samples.
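The penalty term can be sketched in TensorFlow as follows. This is a minimal illustration rather than this repo's exact code: `critic` stands for any Keras model or callable, and the example assumes flat vector inputs rather than images (for images, `eps` would be shaped `[batch, 1, 1, 1]`).

```python
import tensorflow as tf

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    """Gradient penalty term of the WGAN-GP objective (sketch).

    Interpolates between real and fake samples, then penalizes the
    squared deviation of the critic's gradient norm from 1.
    """
    batch_size = tf.shape(real)[0]
    # Random interpolation points x_hat between real and fake samples.
    eps = tf.random.uniform([batch_size, 1], 0.0, 1.0)
    x_hat = eps * real + (1.0 - eps) * fake
    with tf.GradientTape() as tape:
        tape.watch(x_hat)
        d_hat = critic(x_hat)
    grads = tape.gradient(d_hat, x_hat)
    # Per-sample L2 norm of the gradient (epsilon avoids sqrt(0) issues).
    norms = tf.sqrt(tf.reduce_sum(tf.square(grads), axis=-1) + 1e-12)
    return lambda_gp * tf.reduce_mean(tf.square(norms - 1.0))
```

For a critic whose gradient norm is exactly 1 everywhere the penalty vanishes, which is precisely the 1-Lipschitz behavior the objective encourages.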
This enables stable training of a variety of GAN models on a wide range of datasets.

## Requirements

- Python 3.7
- TensorFlow 2.4.x
- NumPy
- PIL (Pillow)
- Matplotlib
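The dependencies above can be installed with pip; one possible command (package names assumed from the list above):

```shell
pip install "tensorflow>=2.4,<2.5" numpy pillow matplotlib
```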
## Dataset

This notebook uses a dataset obtained from Kaggle.
## Training

- Open the `train.ipynb` notebook, then make sure to change the variables

  ```python
  PHOTO_PATH = "YOUR DATASET PATH HERE"
  MODEL_PATH = "MODEL PATH HERE"
  SAVE_PATH = "MODEL SAVE PATH HERE"
  ```

  to point to your desired folders.
- Make sure to give the `EPOCHS` variable a higher value to get more realistic results.
- Now run the notebook.
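The core of what the notebook trains can be sketched as a single WGAN-GP iteration. The tiny `Dense` models, latent size, and variable names below are stand-ins for illustration (the notebook uses convolutional networks); the `N_CRITIC` ratio, penalty coefficient, and Adam settings follow the WGAN-GP paper.

```python
import tensorflow as tf

NOISE_DIM = 8      # assumed latent size; the notebook's value may differ
LAMBDA_GP = 10.0   # gradient-penalty coefficient from the paper
N_CRITIC = 5       # critic updates per generator update, as in the paper

# Tiny stand-in models; the real generator/critic are convolutional.
generator = tf.keras.Sequential(
    [tf.keras.Input(shape=(NOISE_DIM,)), tf.keras.layers.Dense(4, activation="tanh")])
critic = tf.keras.Sequential(
    [tf.keras.Input(shape=(4,)), tf.keras.layers.Dense(1)])
g_opt = tf.keras.optimizers.Adam(1e-4, beta_1=0.0, beta_2=0.9)
c_opt = tf.keras.optimizers.Adam(1e-4, beta_1=0.0, beta_2=0.9)

def train_step(real):
    """One WGAN-GP iteration: N_CRITIC critic updates, then one generator update."""
    batch = tf.shape(real)[0]
    for _ in range(N_CRITIC):
        z = tf.random.normal([batch, NOISE_DIM])
        with tf.GradientTape() as tape:
            fake = generator(z, training=True)
            # Gradient penalty at random interpolates between real and fake.
            eps = tf.random.uniform([batch, 1])
            x_hat = eps * real + (1.0 - eps) * fake
            with tf.GradientTape() as gp_tape:
                gp_tape.watch(x_hat)
                d_hat = critic(x_hat, training=True)
            grads = gp_tape.gradient(d_hat, x_hat)
            norms = tf.sqrt(tf.reduce_sum(tf.square(grads), axis=-1) + 1e-12)
            gp = LAMBDA_GP * tf.reduce_mean(tf.square(norms - 1.0))
            # Critic loss: E[D(fake)] - E[D(real)] + gradient penalty.
            c_loss = (tf.reduce_mean(critic(fake, training=True))
                      - tf.reduce_mean(critic(real, training=True)) + gp)
        c_opt.apply_gradients(
            zip(tape.gradient(c_loss, critic.trainable_variables),
                critic.trainable_variables))
    # Generator update: maximize the critic's score on generated samples.
    z = tf.random.normal([batch, NOISE_DIM])
    with tf.GradientTape() as tape:
        g_loss = -tf.reduce_mean(critic(generator(z, training=True), training=True))
    g_opt.apply_gradients(
        zip(tape.gradient(g_loss, generator.trainable_variables),
            generator.trainable_variables))
    return c_loss, g_loss
```

Note the nested `GradientTape`: the inner tape differentiates the critic with respect to its input for the penalty, while the outer tape differentiates the full loss (penalty included) with respect to the critic's weights.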
## Generating a Face

- Open `generate_face.py` and change the variable `MODEL_PATH = "YOUR FINAL MODEL PATH"` to point to your desired model path.
- Now run the file using `python generate_face.py`.
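Roughly, `generate_face.py` samples the trained generator from random noise. A sketch under assumed conventions (the script's actual function names, latent size, and output range may differ; a `tanh`-style output in `[-1, 1]` is assumed):

```python
import numpy as np
import tensorflow as tf

def generate_face(generator, noise_dim=128, seed=None):
    """Sample one image from a trained generator (illustrative sketch).

    Assumes the generator maps N(0, 1) noise to images in the tanh
    range [-1, 1]; noise_dim=128 is an assumption, not the repo's value.
    """
    z = tf.random.normal([1, noise_dim], seed=seed)
    img = generator(z, training=False)[0].numpy()
    # Rescale from [-1, 1] to uint8 pixel values for saving with PIL.
    return np.clip((img + 1.0) * 127.5, 0, 255).astype(np.uint8)
```

With a trained model this would be used roughly as `generator = tf.keras.models.load_model(MODEL_PATH)` followed by `PIL.Image.fromarray(generate_face(generator)).save("face.png")`.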
## Generating a Video

- Open `generate_video.py` and change the variable `DATA_PATH = "MODEL FOLDER"` to point to your desired model folder.
- Now run the file using `python generate_video.py`.
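A video of generated faces is typically built by interpolating in the latent space. The sketch below shows one way to produce such frames; the function name, latent size, and `[-1, 1]` output range are assumptions, not the script's actual code:

```python
import numpy as np
import tensorflow as tf

def interpolation_frames(generator, n_frames=30, noise_dim=128):
    """Frames morphing between two random latent points (sketch).

    Linearly interpolates between two noise vectors and decodes each
    intermediate point with the generator.
    """
    z0 = tf.random.normal([1, noise_dim])
    z1 = tf.random.normal([1, noise_dim])
    frames = []
    for t in np.linspace(0.0, 1.0, n_frames):
        z = (1.0 - float(t)) * z0 + float(t) * z1
        img = generator(z, training=False)[0].numpy()
        # Map the assumed tanh output range [-1, 1] to uint8 frames.
        frames.append(np.clip((img + 1.0) * 127.5, 0, 255).astype(np.uint8))
    return frames
```

The frames could then be written out, e.g. as a GIF with PIL: `Image.fromarray(frames[0]).save("morph.gif", save_all=True, append_images=[Image.fromarray(f) for f in frames[1:]])`.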