Cost of generative model not reducing #7
Comments
I'm having the same issue.
I too have this issue. A cheap workaround is to change the Data Distribution to self.mu = -2 or something like this.
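(For context, here is a minimal sketch of what that workaround would look like, assuming the repository's real-data distribution is a one-dimensional Gaussian exposed through a class with a `mu` attribute; the class name, defaults, and `sample` signature below are illustrative, not necessarily the repository's own.)

```python
import numpy as np

class DataDistribution(object):
    """Illustrative 1-D Gaussian "real data" distribution.

    The workaround above amounts to changing `mu` here (e.g. to -2),
    i.e. changing the target data rather than fixing the generator,
    which is what the next comment objects to.
    """
    def __init__(self, mu=-2.0, sigma=0.5):
        self.mu = mu        # mean of the real-data Gaussian
        self.sigma = sigma  # standard deviation

    def sample(self, n):
        # Draw n samples from N(mu, sigma^2).
        return np.random.normal(self.mu, self.sigma, n)
```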
@MalteFlender How is that even a workaround, let alone a cheap one? How can one change the original data distribution in any scenario? It's like saying my generator is not creating MNIST samples and is instead creating random noise, so I'll just drop the MNIST dataset and declare that my original dataset is noise, and therefore my generator works! 😮
@aashish-kumar, @yuluntian, @MalteFlender, if you are still interested, I found the same diagram as in @aashish-kumar's video in this paper: "Many Paths to Equilibrium: GANs Do Not Need to Decrease a Divergence at Every Step" [see Fig. 2, page 5]. I am currently reading this paper; you can check it out if interested.
Wasserstein GAN is a solution to this problem: https://www.alexirpan.com/2017/02/22/wasserstein-gan.html
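A rough sketch of the Wasserstein objective that link describes, written in PyTorch purely for illustration (this is not code from this repository; `G`, `D`, `opt_g`, `opt_d`, and `z_dim` are assumed to be a generator, a critic, their optimizers, and the latent size):

```python
import torch

def wgan_step(G, D, real_data, opt_g, opt_d, z_dim=10, clip=0.01, n_critic=5):
    """One WGAN update (Arjovsky et al., 2017): critic trained n_critic times,
    then one generator step. The critic has no sigmoid/log in its loss."""
    batch = real_data.size(0)

    # Train the critic: maximize D(real) - D(fake).
    for _ in range(n_critic):
        z = torch.randn(batch, z_dim)
        fake = G(z).detach()
        d_loss = -(D(real_data).mean() - D(fake).mean())
        opt_d.zero_grad()
        d_loss.backward()
        opt_d.step()
        # Weight clipping keeps the critic approximately 1-Lipschitz.
        for p in D.parameters():
            p.data.clamp_(-clip, clip)

    # Train the generator: minimize -D(G(z)).
    z = torch.randn(batch, z_dim)
    g_loss = -D(G(z)).mean()
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```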
The cost of the discriminator model converges quickly to a low value, after which there is no perceivable change in the cost of the generative model. Can you please suggest what the issue might be?
this.mp4.tar.gz