A Multi-armed Bandit MCMC, with applications in sampling from doubly intractable posterior

03/13/2019
by   Wang Guanyang, et al.

Markov chain Monte Carlo (MCMC) algorithms are widely used to sample from complicated distributions, especially from posterior distributions in Bayesian inference. However, MCMC is not directly applicable to doubly intractable problems, where the likelihood itself involves an intractable, parameter-dependent normalizing constant, so the Metropolis-Hastings acceptance ratio cannot be evaluated. In this paper, we discuss and compare two existing solutions: pseudo-marginal Monte Carlo and the exchange algorithm. We also propose a novel algorithm, Multi-armed Bandit MCMC (MABMC), which chooses between two (or more) randomized acceptance ratios at each step. MABMC can be applied directly to combine pseudo-marginal Monte Carlo and the exchange algorithm, yielding a higher average acceptance probability.
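For intuition, here is a minimal, self-contained Python sketch of the kind of step MABMC operates on: a toy doubly intractable model in which two randomized acceptance ratios are available, one exchange-style and one pseudo-marginal-style, and a simple bandit rule picks between them at each iteration. The toy model (an exponential likelihood whose normalizing constant is treated as unknown), the epsilon-greedy selection rule, and all function names are illustrative assumptions, not the construction in the paper; in particular, the paper specifies the actual MABMC selection rule and the conditions under which the resulting chain still targets the correct posterior.

import numpy as np

rng = np.random.default_rng(0)

# Toy doubly intractable model (illustrative assumption, not from the paper):
# unnormalized likelihood f_tilde(x | theta) = exp(-theta * x) on x > 0,
# whose normalizing constant Z(theta) = 1/theta is pretended to be unknown.
# Exact samples from the model are still available (Exponential(theta)),
# which is what the exchange algorithm requires.

def log_f_tilde(x, theta):
    # Log unnormalized likelihood, summed over the data vector x.
    return -theta * np.sum(x)

def log_prior(theta):
    # Exponential(1) prior on theta > 0 (an assumption for this sketch).
    return -theta if theta > 0 else -np.inf

def log_accept_exchange(theta, theta_new, x):
    # Exchange-algorithm log acceptance ratio: draw an auxiliary data set
    # w ~ f(. | theta_new) and swap it into the ratio so that the unknown
    # normalizing constants cancel.
    w = rng.exponential(scale=1.0 / theta_new, size=len(x))
    return (log_prior(theta_new) - log_prior(theta)
            + log_f_tilde(x, theta_new) - log_f_tilde(x, theta)
            + log_f_tilde(w, theta) - log_f_tilde(w, theta_new))

def log_accept_pseudo_marginal(theta, theta_new, x, M=20):
    # Pseudo-marginal-style log acceptance ratio: replace the intractable
    # ratio Z(theta)/Z(theta_new) by an M-sample importance-sampling
    # estimate using draws from f(. | theta_new). This is a schematic
    # stand-in, not the exact pseudo-marginal construction.
    w = rng.exponential(scale=1.0 / theta_new, size=M)
    log_z_ratio_hat = np.log(np.mean(np.exp(-(theta - theta_new) * w)))
    return (log_prior(theta_new) - log_prior(theta)
            + log_f_tilde(x, theta_new) - log_f_tilde(x, theta)
            + len(x) * log_z_ratio_hat)

def mabmc_style_sampler(x, n_iter=5000, step=0.3, eps=0.1):
    # Schematic MABMC-style loop: at each step an epsilon-greedy bandit
    # picks one of the two randomized acceptance ratios, favoring the arm
    # with the higher empirical acceptance rate. The selection rule here is
    # a placeholder; it is not the rule analyzed in the paper.
    arms = [log_accept_exchange, log_accept_pseudo_marginal]
    accepts, pulls = np.zeros(2), np.ones(2)
    theta, samples = 1.0, []
    for _ in range(n_iter):
        theta_new = theta + step * rng.standard_normal()
        if theta_new <= 0:                      # outside prior support: reject
            samples.append(theta)
            continue
        if rng.random() < eps:                  # explore
            k = rng.integers(2)
        else:                                   # exploit
            k = int(np.argmax(accepts / pulls))
        log_a = arms[k](theta, theta_new, x)
        pulls[k] += 1
        if np.log(rng.random()) < log_a:
            theta = theta_new
            accepts[k] += 1
        samples.append(theta)
    return np.array(samples), accepts / pulls

# Synthetic data: 50 draws from Exponential(theta_true = 2.0).
x = rng.exponential(scale=0.5, size=50)
draws, acc_rates = mabmc_style_sampler(x)
print("posterior mean of theta ~", draws[1000:].mean())
print("per-arm acceptance rates:", acc_rates)

The per-arm acceptance rates make the point of the abstract concrete: when one acceptance ratio is systematically larger, a rule that favors it raises the average acceptance probability of the chain.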


Related research

- Pseudo-Marginal Hamiltonian Monte Carlo (07/08/2016)
  Bayesian inference in the presence of an intractable likelihood function...

- Stochastic Gradient MCMC with Multi-Armed Bandit Tuning (05/27/2021)
  Stochastic gradient Markov chain Monte Carlo (SGMCMC) is a popular class...

- On Thompson Sampling with Langevin Algorithms (02/23/2020)
  Thompson sampling is a methodology for multi-armed bandit problems that ...

- Image Segmentation with Pseudo-marginal MCMC Sampling and Nonparametric Shape Priors (09/03/2018)
  In this paper, we propose an efficient pseudo-marginal Markov chain Mont...

- Barker's algorithm for Bayesian inference with intractable likelihoods (09/22/2017)
  In this expository paper we abstract and describe a simple MCMC scheme f...

- Scalable Discrete Sampling as a Multi-Armed Bandit Problem (06/30/2015)
  Drawing a sample from a discrete distribution is one of the building com...

- Variational consensus Monte Carlo (06/09/2015)
  Practitioners of Bayesian statistics have long depended on Markov chain ...
