From GAN to WGAN

04/18/2019
by Lilian Weng, et al.
This paper explains the math behind the generative adversarial network (GAN) model and why GANs are hard to train. Wasserstein GAN aims to improve GAN training by adopting a smooth metric, the Wasserstein distance, to measure the distance between two probability distributions.
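A minimal sketch of why a smooth metric matters, using the standard parallel-distributions example: for two 1D distributions separated by a shift `theta`, the Wasserstein-1 distance grows smoothly with `theta`, while the Jensen-Shannon divergence jumps to a constant as soon as the supports are disjoint, giving the generator no useful gradient. The helper below (an assumption for illustration, not code from the paper) estimates W1 for equal-size 1D samples via the sorted-coupling formula:

```python
import math

def w1_empirical(xs, ys):
    """Empirical 1-Wasserstein distance between two equal-size 1D samples.
    In 1D the optimal transport plan matches sorted points, so W1 is the
    mean absolute gap between the sorted samples."""
    assert len(xs) == len(ys)
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / len(xs)

# Two point masses separated by theta: W1 varies smoothly with theta ...
for theta in (0.0, 0.5, 1.0):
    p = [0.0] * 100
    q = [theta] * 100
    print(f"theta={theta}: W1={w1_empirical(p, q):.2f}")
# ... whereas the JS divergence between these pairs is log(2) for every
# theta != 0 and 0 at theta == 0, i.e. discontinuous in theta.
```

This discontinuity is the core motivation the post walks through: the Wasserstein distance supplies a gradient even when the real and generated distributions have disjoint supports.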
