Bayesian inference in decomposable graphical models using sequential Monte Carlo methods

05/31/2018
by Jimmy Olsson, et al.

In this study we present a sequential sampling methodology for Bayesian inference in decomposable graphical models. We recast the problem of graph estimation, which in general lacks a natural sequential interpretation, into a sequential setting. Specifically, we propose a recursive Feynman-Kac model which generates a flow of junction tree distributions over a space of increasing dimension, and we develop an efficient sequential Monte Carlo sampler for this flow. As a key ingredient of the proposal kernel in our sampler we use the Christmas tree algorithm developed in the companion paper Olsson et al. [2018]. We focus on particle MCMC methods, in particular particle Gibbs (PG), as it allows for generating MCMC chains with global moves on the underlying space of decomposable graphs. To further improve the mixing properties of this PG algorithm, we incorporate a systematic refreshment step implemented through direct sampling from a backward kernel. The theoretical properties of the algorithm are investigated, showing in particular that the refreshment step improves the performance of the algorithm in terms of the asymptotic variance of the estimated distribution. The accuracy of the graph estimators is illustrated through a collection of numerical examples demonstrating the feasibility of the suggested approach in both discrete and continuous graphical models.
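To make the abstract's setup more concrete, the sketch below shows a generic (conditional) sequential Monte Carlo sweep over a sequence of target distributions of increasing dimension, of the kind that drives a particle Gibbs sampler. It is a minimal illustration only, not the paper's implementation: the callables sample_init, propose and log_weight are hypothetical placeholders (in the paper the proposal kernel is the Christmas tree algorithm, and the backward-kernel refreshment step is not shown here).

```python
import numpy as np

def conditional_smc(n_particles, n_steps, sample_init, propose, log_weight, ref_path=None):
    """One (conditional) SMC sweep over targets of increasing dimension.

    sample_init(): draws an initial state.
    propose(trajectory, t): extends a trajectory by one state (proposal kernel).
    log_weight(trajectory, t): incremental log importance weight at step t.
    ref_path: optional reference trajectory; when supplied, particle 0 is pinned
    to it, which is the conditional SMC move used inside particle Gibbs.
    """
    particles = [[sample_init()] for _ in range(n_particles)]
    if ref_path is not None:
        particles[0] = [ref_path[0]]
    log_w = np.array([log_weight(p, 0) for p in particles])

    for t in range(1, n_steps):
        # Resample ancestors proportionally to the current weights; in the
        # conditional sweep, particle 0 always keeps the reference ancestor.
        probs = np.exp(log_w - log_w.max())
        probs /= probs.sum()
        ancestors = np.random.choice(n_particles, size=n_particles, p=probs)
        if ref_path is not None:
            ancestors[0] = 0
        particles = [list(particles[a]) for a in ancestors]

        # Extend every trajectory via the proposal kernel and reweight.
        for i in range(n_particles):
            if ref_path is not None and i == 0:
                x_new = ref_path[t]  # keep the reference path fixed
            else:
                x_new = propose(particles[i], t)
            particles[i].append(x_new)
        log_w = np.array([log_weight(p, t) for p in particles])

    return particles, log_w
```

In a particle Gibbs iteration one would run such a sweep conditioned on the current reference trajectory, draw a new trajectory from the weighted particle cloud, and repeat; the refreshment step described in the abstract would additionally resample parts of the retained trajectory from a backward kernel to improve mixing.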


research
01/03/2014

Particle Gibbs with Ancestor Sampling

Particle Markov chain Monte Carlo (PMCMC) is a systematic way of combini...
research
06/02/2018

Sequential sampling of junction trees for decomposable graphs

The junction tree representation provides an attractive structural prope...
research
01/08/2019

Graphical model inference: Sequential Monte Carlo meets deterministic approximations

Approximate inference in probabilistic graphical models (PGMs) can be gr...
research
06/18/2018

Stability of Conditional Sequential Monte Carlo

The particle Gibbs (PG) sampler is a Markov Chain Monte Carlo (MCMC) alg...
research
07/21/2023

Adaptively switching between a particle marginal Metropolis-Hastings and a particle Gibbs kernel in SMC^2

Sequential Monte Carlo squared (SMC^2; Chopin et al., 2012) methods can ...
research
04/01/2019

Fully-Asynchronous Distributed Metropolis Sampler with Optimal Speedup

The Metropolis-Hastings algorithm is a fundamental Markov chain Monte Ca...
research
11/04/2020

Waste-free Sequential Monte Carlo

A standard way to move particles in a SMC sampler is to apply several st...
