Markov Chain Importance Sampling - a highly efficient estimator for MCMC

05/18/2018
by Ingmar Schuster et al.

Markov chain algorithms are ubiquitous in machine learning, statistics, and many other disciplines. In this work we present a novel estimator applicable to several classes of Markov chains, dubbed Markov chain importance sampling (MCIS). For a broad class of Metropolis-Hastings algorithms, MCIS efficiently makes use of rejected proposals. For discretized Langevin diffusions, it provides a novel way of correcting the discretization error. Our estimator satisfies a central limit theorem and improves on error per CPU cycle, often to a large extent. As a by-product, it enables estimating the normalizing constant, an important quantity in Bayesian machine learning and statistics.
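The core idea of reusing rejected Metropolis-Hastings proposals can be illustrated with self-normalized importance sampling: every proposal drawn by the chain is kept and weighted by the ratio of the (unnormalized) target density to the proposal density, so no evaluation is wasted. The averaged raw weights also estimate the normalizing constant, since the expected weight of each proposal equals the target's normalizer. The sketch below is an illustrative reading of this idea, not the paper's exact estimator; the target (an unnormalized standard normal), the random-walk proposal, and all function names are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)


def log_target(x):
    # Unnormalized standard normal; true normalizer Z = sqrt(2*pi).
    return -0.5 * x**2


def q_logpdf(y, x, s):
    # Log density of the Gaussian random-walk proposal N(x, s^2).
    return -0.5 * ((y - x) / s) ** 2 - np.log(s * np.sqrt(2.0 * np.pi))


def mcis(n_iter=50_000, s=2.0):
    """Run Metropolis-Hastings, but keep EVERY proposal with an
    importance weight w = target(y) / q(y | x_t), accepted or not.
    (Illustrative sketch of the rejected-proposal reuse idea.)"""
    x = 0.0
    ys = np.empty(n_iter)
    logw = np.empty(n_iter)
    for t in range(n_iter):
        y = x + s * rng.standard_normal()
        # Importance weight of the proposal relative to its proposal density.
        logw[t] = log_target(y) - q_logpdf(y, x, s)
        ys[t] = y
        # Standard MH accept/reject step (symmetric proposal).
        if np.log(rng.random()) < log_target(y) - log_target(x):
            x = y
    w = np.exp(logw)
    # Self-normalized estimate of E[f(X)] with f(x) = x.
    mean_est = np.sum(w * ys) / np.sum(w)
    # Raw-weight average estimates the normalizing constant Z,
    # because E_q[target(Y)/q(Y|x)] = Z for any current state x.
    z_est = np.mean(w)
    return mean_est, z_est
```

A usage note: for the standard normal target, `mean_est` should land near 0 and `z_est` near sqrt(2*pi) ≈ 2.507, whereas a plain MH average would simply discard the rejected draws that contribute here.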



Related research

05/18/2018 · On a Metropolis-Hastings importance sampling estimator
A classical approach for approximating expectations of functions w.r.t. ...

07/17/2022 · The Importance Markov Chain
The Importance Markov chain is a new algorithm bridging the gap between ...

05/17/2023 · Stein Π-Importance Sampling
Stein discrepancies have emerged as a powerful tool for retrospective im...

03/02/2022 · Understanding the Sources of Error in MBAR through Asymptotic Analysis
Multiple sampling strategies commonly used in molecular dynamics, such a...

01/25/2020 · The reproducing Stein kernel approach for post-hoc corrected sampling
Stein importance sampling is a widely applicable technique based on kern...

01/16/2013 · Stochastic Logic Programs: Sampling, Inference and Applications
Algorithms for exact and approximate inference in stochastic logic progr...

06/24/2021 · Three rates of convergence or separation via U-statistics in a dependent framework
Despite the ubiquity of U-statistics in modern Probability and Statistic...
