Pull Message Passing for Nonparametric Belief Propagation

07/27/2018
by   Karthik Desingh, et al.

We present a "pull" approach to approximating products of Gaussian mixtures within message updates for Nonparametric Belief Propagation (NBP) inference. Existing NBP methods often represent messages between continuous-valued latent variables as Gaussian mixture models. To avoid computational intractability in loopy graphs, NBP requires an approximation of the product of such mixtures. Sampling-based product approximations have proven effective for NBP inference, but when used within traditional "push" message update procedures they quickly become computationally prohibitive for multi-modal distributions over high-dimensional variables. In contrast, we propose a "pull" method, the Pull Message Passing for Nonparametric Belief Propagation (PMPNBP) algorithm, and demonstrate its viability for efficient inference. We report results on an experiment from an existing NBP method, PAMPAS, for inferring the pose of an articulated structure in clutter. On this illustrative problem, PMPNBP scales more efficiently in the number of mixture components and, consequently, achieves greater inference accuracy.
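The core operation the abstract describes, approximating a product of Gaussian mixtures by sampling, can be sketched in the "pull" spirit: draw samples once from a proposal mixture, then have each sample pull a weight from the densities of the remaining mixtures, rather than pushing all pairwise component products. The 1-D sketch below is illustrative only; the function names and the weighted-mean summary are our assumptions, not the paper's implementation.

```python
import math
import random

def gm_pdf(x, comps):
    """Density of a 1-D Gaussian mixture; comps = [(weight, mean, std), ...]."""
    return sum(w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
               for w, m, s in comps)

def pull_product_approx(proposal, others, n=2000, seed=0):
    """Sampling-based ("pull") approximation of a product of Gaussian mixtures.

    Samples are drawn from `proposal`; each sample is then weighted by the
    product of the densities of the remaining mixtures in `others`.
    Returns the weighted mean as a simple summary of the product distribution.
    """
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        # Pick a mixture component by weight, then sample from it.
        _, m, s = rng.choices(proposal, weights=[c[0] for c in proposal])[0]
        samples.append(rng.gauss(m, s))
    # Each sample "pulls" its weight from the other incoming mixtures.
    weights = [math.prod(gm_pdf(x, gm) for gm in others) for x in samples]
    total = sum(weights)
    weights = [w / total for w in weights]
    return sum(w * x for w, x in zip(weights, samples))
```

For two unit-variance Gaussians N(0, 1) and N(2, 1), the normalized product is N(1, 0.5), so the weighted mean should land near 1.0. The cost here is linear in the number of samples per incoming mixture, which is what lets the mixture size scale, in contrast to the combinatorial blow-up of explicit "push" component products.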

