
Statistical and Topological Properties of Sliced Probability Divergences

03/12/2020
by Kimia Nadjahi, et al.

The idea of slicing divergences has proven successful for comparing two probability measures in various machine learning applications, including generative modeling. It consists of computing the expected value of a "base divergence" between one-dimensional random projections of the two measures. However, the computational and statistical consequences of this technique have not yet been well established. In this paper, we aim to bridge this gap and derive several properties of sliced divergence functions. First, we show that slicing preserves the metric axioms and the weak continuity of the divergence, so the sliced divergence shares similar topological properties. We then make these results precise in the case where the base divergence belongs to the class of integral probability metrics. Furthermore, we establish that, under mild conditions, the sample complexity of the sliced divergence does not depend on the dimension, even when the base divergence suffers from the curse of dimensionality. Finally, we apply our general results to the Wasserstein distance and Sinkhorn divergences, and illustrate our theory through experiments on both synthetic and real data.
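To make the slicing construction concrete, below is a minimal Monte Carlo sketch, not the authors' implementation, that uses the one-dimensional Wasserstein distance from scipy as the base divergence. The function name `sliced_wasserstein`, the number of projections, and the toy Gaussian data are illustrative assumptions.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein(x, y, n_projections=100, seed=None):
    """Monte Carlo estimate of the sliced Wasserstein-1 distance between
    two empirical measures given as sample arrays of shape (n, dim).

    Illustrative sketch: the expectation over projection directions is
    approximated by averaging over `n_projections` random directions.
    """
    rng = np.random.default_rng(seed)
    dim = x.shape[1]
    total = 0.0
    for _ in range(n_projections):
        # Draw a projection direction uniformly on the unit sphere.
        theta = rng.standard_normal(dim)
        theta /= np.linalg.norm(theta)
        # Project both samples to one dimension and apply the base
        # divergence (here, scipy's 1-D Wasserstein-1 distance).
        total += wasserstein_distance(x @ theta, y @ theta)
    return total / n_projections

# Toy usage: two 50-dimensional Gaussian samples with shifted means.
x = np.random.default_rng(0).normal(0.0, 1.0, size=(500, 50))
y = np.random.default_rng(1).normal(0.5, 1.0, size=(500, 50))
print(sliced_wasserstein(x, y, n_projections=200, seed=42))
```

Each Monte Carlo term only ever compares one-dimensional projections, which is the structural reason the paper can show the estimator's sample complexity does not depend on the ambient dimension.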

