Limit Distribution Theory for Smooth Wasserstein Distance with Applications to Generative Modeling

02/03/2020
by   Ziv Goldfeld, et al.

The 1-Wasserstein distance (W_1) is a popular proximity measure between probability distributions. Its metric structure, robustness to support mismatch, and rich geometry have fueled its wide adoption for machine learning tasks. Such tasks inherently rely on approximating distributions from data. This surfaces a central issue: empirical approximation under Wasserstein distances suffers from the curse of dimensionality, converging at rate n^(-1/d), where n is the sample size and d is the data dimension; this rate deteriorates drastically in high dimensions. To circumvent this impasse, we adopt the framework of Gaussian-smoothed Wasserstein distance W_1^(σ), where both probability measures are convolved with an isotropic Gaussian distribution of parameter σ > 0. In remarkable contrast to classic W_1, the empirical convergence rate under W_1^(σ) is n^(-1/2) in all dimensions. Inspired by this fact, the present paper conducts an in-depth study of the statistical properties of the smooth Wasserstein distance. We derive the limit distribution of √(n) W_1^(σ)(P_n, P) for all d, where P_n is the empirical measure of n independent observations from P. In arbitrary dimension, the limit is characterized as the supremum of a tight Gaussian process indexed by 1-Lipschitz functions convolved with a Gaussian density. Building on this result, we derive concentration inequalities, establish bootstrap consistency, and explore generative modeling with W_1^(σ) under the minimum distance estimation framework. For the latter, we derive measurability, almost sure convergence, and limit distributions for optimal generative models and their corresponding smooth Wasserstein error. These results promote the smooth Wasserstein distance as a powerful tool for statistical inference in high dimensions.
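The operational fact behind W_1^(σ) is that convolving a measure with an isotropic Gaussian is equivalent in distribution to adding independent N(0, σ²I_d) noise to its samples. The sketch below is a hypothetical illustration of this idea, not code from the paper: it Monte Carlo estimates W_1^(σ)(P_n, P) in d = 1 using SciPy's 1-D Wasserstein solver and checks that √(n) W_1^(σ)(P_n, P) stabilizes as n grows, consistent with the limit theorem. The helper names sample_smoothed and smooth_w1, the smoothing level σ = 1, and the use of a large reference sample as a proxy for the population P are all our assumptions.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)

def sample_smoothed(points, m, sigma, rng):
    """Draw m samples from the Gaussian-smoothed empirical measure
    P_n * N(0, sigma^2): resample the data with replacement, add noise."""
    idx = rng.integers(0, len(points), size=m)
    return points[idx] + sigma * rng.standard_normal(m)

def smooth_w1(x, y, sigma, rng, m=100_000):
    """Monte Carlo estimate of W_1^(sigma) between the empirical measures
    of x and y (1-D only; scipy's exact W_1 solver is one-dimensional).
    m should be large relative to len(x), len(y) to tame sampling error."""
    return wasserstein_distance(
        sample_smoothed(x, m, sigma, rng),
        sample_smoothed(y, m, sigma, rng),
    )

# Toy check of the sqrt(n) scaling with P = N(0, 1), stood in for by a
# large reference sample: sqrt(n) * W_1^(sigma)(P_n, P) should hover
# around a constant as n grows, rather than blow up or vanish.
sigma = 1.0
ref = rng.standard_normal(500_000)   # proxy for the population P
for n in (100, 1_000, 10_000):
    x = rng.standard_normal(n)       # P_n: n independent draws from P
    val = np.sqrt(n) * smooth_w1(x, ref, sigma, rng)
    print(f"n={n:>6}  sqrt(n)*W1_sigma ~ {val:.3f}")
```

For d > 1 the same recipe applies with the 1-D solver swapped for an exact OT solver (e.g., ot.emd2 from the POT library); the n^(-1/2) rate under W_1^(σ) is precisely what keeps such estimates statistically meaningful in high dimension.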

