Relaxed Wasserstein with Applications to GANs

05/19/2017
by Xin Guo, et al.

We propose a novel class of statistical divergences called Relaxed Wasserstein (RW) divergences. RW divergences generalize the Wasserstein distance and are parametrized by strictly convex, differentiable functions, whose Bregman divergences serve as the transport cost. We establish several key probabilistic properties of RW divergences that are critical to the success of Wasserstein distances. In particular, we show that RW is dominated by both the Total Variation (TV) distance and the Wasserstein-L^2 distance, and we establish continuity, differentiability, and a duality representation for RW divergences. Finally, we provide a non-asymptotic moment estimate and a concentration inequality for RW divergences. Our experiments on image generation show that RWGANs with a Kullback-Leibler (KL) cost deliver performance competitive with state-of-the-art approaches; empirically, RWGANs converge better than WGANs while achieving comparable inception scores. Whereas cost functions in the existing GAN literature are chosen ad hoc, this conceptual framework not only provides great flexibility in designing general cost functions, e.g., for applications to GANs, but also allows different cost functions to be implemented and compared under a unified mathematical framework.
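To make the construction concrete: for a strictly convex, differentiable phi, the Bregman divergence D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y> replaces the quadratic ground cost, and the RW divergence is the optimal transport cost under that ground cost. The minimal sketch below computes this for discrete empirical measures; it assumes the POT library (ot.emd2), and the choices of phi are illustrative, not the authors' implementation.

# Minimal sketch of a Relaxed Wasserstein (RW) divergence on discrete
# distributions. Assumes the POT library (https://pythonot.github.io);
# an illustration of the construction, not the paper's implementation.
import numpy as np
import ot  # POT: Python Optimal Transport


def bregman_cost(X, Y, phi, grad_phi):
    """Pairwise Bregman divergences D_phi(x_i, y_j) between support points."""
    n, m = len(X), len(Y)
    C = np.empty((n, m))
    for i in range(n):
        for j in range(m):
            C[i, j] = phi(X[i]) - phi(Y[j]) - grad_phi(Y[j]) @ (X[i] - Y[j])
    return C


def relaxed_wasserstein(a, X, b, Y, phi, grad_phi):
    """RW divergence: exact optimal transport with a Bregman ground cost."""
    C = bregman_cost(X, Y, phi, grad_phi)
    return ot.emd2(a, b, C)  # linear-programming OT cost


# Two small empirical measures on R^2 with uniform weights.
rng = np.random.default_rng(0)
X, Y = rng.normal(size=(5, 2)), rng.normal(loc=1.0, size=(7, 2))
a, b = np.full(5, 1 / 5), np.full(7, 1 / 7)

# phi(x) = ||x||^2 / 2 gives D_phi(x, y) = ||x - y||^2 / 2, recovering
# (half) the squared Wasserstein-L^2 distance.
quad = relaxed_wasserstein(a, X, b, Y,
                           phi=lambda x: 0.5 * x @ x,
                           grad_phi=lambda x: x)

# Negative entropy phi gives a generalized-KL Bregman cost
# (support points must be positive, hence the shift below).
Xp, Yp = np.abs(X) + 0.1, np.abs(Y) + 0.1
kl = relaxed_wasserstein(a, Xp, b, Yp,
                         phi=lambda x: np.sum(x * np.log(x)),
                         grad_phi=lambda x: np.log(x) + 1.0)
print(quad, kl)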

