Sampling by Divergence Minimization

05/02/2021
by Ameer Dharamshi et al.

We introduce a family of Markov Chain Monte Carlo (MCMC) methods designed to sample from target distributions with irregular geometry using an adaptive scheme. In cases where targets exhibit non-Gaussian behaviour, we propose that adaptation should be regional rather than global. Our algorithms minimize the information projection side of the Kullback-Leibler (KL) divergence between the proposal distribution class and the target to encourage proposals distributed similarly to the regional geometry of the target. Unlike traditional adaptive MCMC, this procedure rapidly adapts to the geometry of the current position as it explores the space, without the need for a large batch of samples. We extend this approach to multimodal targets by introducing a heavily tempered chain to enable faster mixing between regions of interest. The divergence minimization algorithms are tested on target distributions with multiple irregularly shaped modes, and we provide results demonstrating the effectiveness of our methods.
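To make the core idea concrete, the sketch below fits a Gaussian proposal to a target by minimizing the information projection KL(q || p) with reparameterized Monte Carlo gradients, then uses the fitted proposal in an independence Metropolis-Hastings sampler. This is an illustrative assumption-laden toy, not the paper's regional algorithm: the target (a 1-D Gaussian N(3, 2^2)), the batch size, learning rate, and all function names are hypothetical choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D unnormalized target: N(3, 2^2).
M, S = 3.0, 2.0

def log_target(x):
    return -0.5 * ((x - M) / S) ** 2

def grad_log_target(x):
    return -(x - M) / S ** 2

# Information projection: fit q = N(mu, sigma^2) by descending
# KL(q || p) = -H(q) - E_q[log p], using x = mu + sigma * eps.
mu, sigma, lr = 0.0, 1.0, 0.05
for _ in range(2000):
    eps = rng.standard_normal(64)
    g = grad_log_target(mu + sigma * eps)
    grad_mu = -g.mean()                            # d KL / d mu
    grad_sigma = -1.0 / sigma - (eps * g).mean()   # entropy + cross-entropy terms
    mu -= lr * grad_mu
    sigma = max(sigma - lr * grad_sigma, 1e-3)

# Use the fitted q as an independence Metropolis-Hastings proposal.
def log_q(x):
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)

chain = [0.0]
for _ in range(5000):
    cur = chain[-1]
    prop = mu + sigma * rng.standard_normal()
    log_alpha = (log_target(prop) - log_target(cur)) + (log_q(cur) - log_q(prop))
    chain.append(prop if np.log(rng.random()) < log_alpha else cur)
chain = np.array(chain[500:])
print(round(chain.mean(), 1), round(chain.std(), 1))
```

Because the fitted proposal closely matches the target's local scale, the acceptance rate is high and the chain mixes quickly; the paper's methods apply this idea regionally, re-adapting as the chain moves between differently shaped parts of the target.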


