Herded Gibbs Sampling

01/17/2013
by Luke Bornn, et al.

The Gibbs sampler is one of the most popular algorithms for inference in statistical models. In this paper, we introduce a herding variant of this algorithm, called herded Gibbs, that is entirely deterministic. We prove that herded Gibbs has an O(1/T) convergence rate for models with independent variables and for fully connected probabilistic graphical models. Herded Gibbs is shown to outperform Gibbs in the tasks of image denoising with Markov random fields (MRFs) and named entity recognition with conditional random fields (CRFs). However, the convergence of herded Gibbs for sparsely connected probabilistic graphical models remains an open problem.
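The idea behind herding is to replace each random conditional draw with a deterministic weight update, so that the empirical frequency of each state tracks its target probability at an O(1/T) rate. The sketch below illustrates this for the simplest case mentioned in the abstract, independent binary variables; the function names, the weight initialization of 0.5, and the sweep structure are illustrative assumptions, not the authors' code.

```python
def herded_bernoulli_step(w, p):
    # Deterministic herding update for a Bernoulli(p) variable:
    # accumulate p into the weight, emit 1 and subtract 1 whenever
    # the weight crosses 1. The emitted sequence has empirical
    # frequency within O(1/T) of p after T steps.
    w += p
    if w >= 1.0:
        return 1, w - 1.0
    return 0, w

def herded_gibbs_independent(probs, T, w0=0.5):
    # One herding weight per variable; sweep all variables each of
    # the T iterations, exactly as a Gibbs sweep would, but with no
    # randomness anywhere.
    n = len(probs)
    w = [w0] * n
    counts = [0] * n
    for _ in range(T):
        for i in range(n):
            x_i, w[i] = herded_bernoulli_step(w[i], probs[i])
            counts[i] += x_i
    # Empirical marginal estimates after T sweeps.
    return [c / T for c in counts]
```

For dependent variables, herded Gibbs instead keeps one weight per conditioning configuration of a variable's neighbors, which is why the paper's analysis covers the fully connected case; this sketch omits that bookkeeping.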


