Scalable MCMC for Mixed Membership Stochastic Blockmodels

10/16/2015
by Wenzhe Li, et al.

We propose a stochastic gradient Markov chain Monte Carlo (SG-MCMC) algorithm for scalable inference in mixed-membership stochastic blockmodels (MMSB). Our algorithm is based on the stochastic gradient Riemannian Langevin sampler and, at every iteration, is both faster and more accurate than the current state-of-the-art algorithm based on stochastic variational inference. In addition, we develop an approximation that can handle models with a very large number of communities. The experimental results show that SG-MCMC strictly dominates the competing algorithms in all cases.
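The core move in the stochastic gradient Riemannian Langevin sampler is a preconditioned Langevin step driven by a minibatch gradient estimate. Below is a minimal sketch of that generic update for a simplex-constrained parameter, following the expanded-mean parameterization of Patterson and Teh (2013); it is not the paper's full MMSB algorithm, and the function name and the `noisy_grad` estimator are placeholders.

```python
import numpy as np

def sgrld_step(theta, noisy_grad, step_size, rng):
    """One stochastic gradient Riemannian Langevin step (sketch).

    theta      : nonnegative expanded-mean parameters; pi = theta / theta.sum()
                 is the corresponding point on the probability simplex
    noisy_grad : minibatch estimate of the gradient of the log posterior
                 with respect to theta
    """
    noise = rng.normal(size=theta.shape)
    # Riemannian preconditioner G(theta)^{-1} = diag(theta): its divergence
    # correction term equals 1, and the injected noise has covariance
    # step_size * diag(theta). Taking |.| mirrors any negative proposal
    # back into the nonnegative orthant.
    return np.abs(
        theta
        + 0.5 * step_size * (theta * noisy_grad + 1.0)
        + np.sqrt(step_size * theta) * noise
    )

# Example usage with a dummy (zero) gradient estimate:
rng = np.random.default_rng(0)
theta = np.ones(4)
theta = sgrld_step(theta, noisy_grad=np.zeros(4), step_size=1e-3, rng=rng)
```

The diag(theta) preconditioning adapts the step size to the geometry of the simplex, which is what allows stable steps for the sparse membership vectors that arise in MMSB.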

Related research

07/16/2019 · Stochastic gradient Markov chain Monte Carlo
Markov chain Monte Carlo (MCMC) algorithms are generally regarded as the...

09/27/2018 · Fast and Scalable Position-Based Layout Synthesis
The arrangement of objects into a layout can be challenging for non-expe...

07/19/2021 · Structured Stochastic Gradient MCMC
Stochastic gradient Markov chain Monte Carlo (SGMCMC) is considered the ...

12/10/2015 · Scalable Modeling of Conversational-role based Self-presentation Characteristics in Large Online Forums
Online discussion forums are complex webs of overlapping subcommunities ...

06/15/2015 · A Complete Recipe for Stochastic Gradient MCMC
Many recent Markov chain Monte Carlo (MCMC) samplers leverage continuous...

09/28/2020 · Variational Temporal Deep Generative Model for Radar HRRP Target Recognition
We develop a recurrent gamma belief network (rGBN) for radar automatic t...

12/18/2022 · Pigeonhole Stochastic Gradient Langevin Dynamics for Large Crossed Mixed Effects Models
Large crossed mixed effects models with imbalanced structures and missin...
