Amortized Bayesian inference for clustering models

11/24/2018
by Ari Pakman, et al.

We develop methods for efficient amortized approximate Bayesian inference over posterior distributions of probabilistic clustering models, such as Dirichlet process mixture models. The approach is based on mapping distributed, symmetry-invariant representations of cluster arrangements into conditional probabilities. The method parallelizes easily, yields iid samples from the approximate posterior of cluster assignments at the same computational cost as a single Gibbs sampler sweep, and applies equally to conjugate and non-conjugate models, since training requires only samples from the generative model.
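To make the sampling scheme concrete, below is a minimal PyTorch sketch of how sequential amortized assignment sampling can work: each point is assigned in turn by a network that maps permutation-invariant cluster summaries to conditional probabilities over "join an existing cluster" or "open a new one". The networks h_enc and phi, the embedding size H, and the zero-vector summary for an empty cluster are illustrative assumptions, not the authors' exact architecture; see the full text for the paper's own networks and training procedure.

```python
import torch
import torch.nn as nn

# Hedged sketch of amortized posterior sampling of cluster assignments,
# in the spirit of the abstract but NOT the authors' exact architecture.
# h_enc and phi are hypothetical learned networks: h_enc embeds single
# points, and phi maps a (cluster summary, point embedding) pair to an
# unnormalized log-probability of that point joining that cluster.

D, H = 2, 32  # data dimension and embedding size (illustrative choices)
h_enc = nn.Sequential(nn.Linear(D, H), nn.ReLU(), nn.Linear(H, H))
phi = nn.Sequential(nn.Linear(2 * H, H), nn.ReLU(), nn.Linear(H, 1))

def sample_assignments(x):
    """Sample one cluster assignment per row of x (shape n x D),
    sequentially: each point joins an existing cluster or opens a new
    one, drawn from the network's conditional distribution. Each call
    costs one pass over the data, like a single Gibbs sweep, and
    repeated calls yield iid samples from the approximate posterior."""
    n = x.shape[0]
    labels = torch.zeros(n, dtype=torch.long)
    # Per-cluster sums of point embeddings: a permutation-invariant
    # summary of the current cluster arrangement.
    sums = [h_enc(x[0:1]).squeeze(0)]  # first point starts cluster 0
    for i in range(1, n):
        hi = h_enc(x[i:i + 1]).squeeze(0)
        # Candidate summaries: each existing cluster, plus an empty
        # (zero) summary standing in for a brand-new cluster.
        cands = sums + [torch.zeros_like(hi)]
        logits = torch.cat([phi(torch.cat([s, hi])) for s in cands])
        k = torch.distributions.Categorical(logits=logits).sample().item()
        labels[i] = k
        if k == len(sums):
            sums.append(hi.clone())  # point opens a new cluster
        else:
            sums[k] = sums[k] + hi   # point joins cluster k
    return labels

# Toy usage with untrained networks; in practice (h_enc, phi) would be
# trained by maximum likelihood on (data, assignment) pairs simulated
# from the generative model, e.g. a Dirichlet process mixture.
x = torch.randn(10, D)
with torch.no_grad():
    print(sample_assignments(x))
```

Because each call draws an independent sample, many posterior samples can be generated in parallel simply by running calls in a batch, which is what makes the scheme easy to parallelize.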


Related research

Discrete Neural Processes (12/28/2018)
Many data generating processes involve latent random variables over disc...

Computational Solutions for Bayesian Inference in Mixture Models (12/18/2018)
This chapter surveys the most standard Monte Carlo methods available for...

Dirichlet Process Mixtures of Generalized Mallows Models (03/15/2012)
We present a Dirichlet process mixture model over discrete incomplete ra...

Multinomial Models with Linear Inequality Constraints: Overview and Improvements of Computational Methods for Bayesian Inference (08/21/2018)
Many psychological theories can be operationalized as linear inequality ...

Finite Mixtures of ERGMs for Ensembles of Networks (10/24/2019)
Ensembles of networks arise in many scientific fields, but currently the...

Graph-Aligned Random Partition Model (GARP) (06/14/2023)
Bayesian nonparametric mixtures and random partition models are powerful...

Discovering Inductive Bias with Gibbs Priors: A Diagnostic Tool for Approximate Bayesian Inference (03/07/2022)
Full Bayesian posteriors are rarely analytically tractable, which is why...
