Approximate Bayesian Computation with the Sliced-Wasserstein Distance

10/28/2019
by Kimia Nadjahi, et al.

Approximate Bayesian Computation (ABC) is a popular method for approximate inference in generative models whose likelihood is intractable but easy to sample from. It constructs an approximate posterior distribution by finding parameters for which the simulated data are close to the observations in terms of summary statistics. These statistics are defined beforehand and might induce a loss of information, which has been shown to deteriorate the quality of the approximation. To overcome this problem, Wasserstein-ABC was recently proposed: it compares the datasets via the Wasserstein distance between their empirical distributions, but does not scale well with the dimension or the number of samples. We propose a new ABC technique, called Sliced-Wasserstein ABC, based on the Sliced-Wasserstein distance, which has better computational and statistical properties. We derive two theoretical results showing the asymptotic consistency of our approach, and we illustrate its advantages on synthetic data and an image denoising task.
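To make the idea concrete, here is a minimal sketch (not the authors' implementation) of vanilla ABC rejection sampling where the discrepancy between observed and simulated datasets is a Monte Carlo estimate of the Sliced-Wasserstein-1 distance: random directions are drawn on the sphere, both samples are projected onto each direction, and the 1D Wasserstein distances between the sorted projections are averaged. All function names, parameters, and the rejection scheme below are illustrative assumptions.

```python
import numpy as np

def sliced_wasserstein(x, y, n_proj=50, rng=None):
    """Monte Carlo Sliced-Wasserstein-1 estimate between two equal-size
    samples x, y of shape (n, d)."""
    rng = np.random.default_rng(rng)
    d = x.shape[1]
    # Draw random directions uniformly on the unit sphere.
    theta = rng.normal(size=(n_proj, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # Project both samples on each direction; for 1D samples of equal size,
    # the Wasserstein-1 distance is the mean absolute difference of order statistics.
    px = np.sort(x @ theta.T, axis=0)
    py = np.sort(y @ theta.T, axis=0)
    return np.mean(np.abs(px - py))

def swabc_rejection(y_obs, prior_sample, simulate, eps, n_draws=1000):
    """Vanilla ABC rejection: keep parameters whose simulated data fall
    within eps of the observations in Sliced-Wasserstein distance."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()          # draw a parameter from the prior
        y_sim = simulate(theta)         # simulate a dataset given theta
        if sliced_wasserstein(y_obs, y_sim) < eps:
            accepted.append(theta)
    return np.array(accepted)
```

For instance, with observations drawn from a standard 2D Gaussian, a Gaussian prior on the mean, and `simulate(theta)` adding standard normal noise to `theta`, the accepted parameters concentrate around the true mean as `eps` shrinks. Replacing `sliced_wasserstein` with the full Wasserstein distance recovers Wasserstein-ABC, at a much higher per-comparison cost in high dimension.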

