
Stochastic Stein Discrepancies

07/06/2020
by Jackson Gorham, et al.

Stein discrepancies (SDs) monitor convergence and non-convergence in approximate inference when exact integration and sampling are intractable. However, the computation of a Stein discrepancy can be prohibitive if the Stein operator - often a sum over likelihood terms or potentials - is expensive to evaluate. To address this deficiency, we show that stochastic Stein discrepancies (SSDs) based on subsampled approximations of the Stein operator inherit the convergence control properties of standard SDs with probability 1. In our experiments with biased Markov chain Monte Carlo (MCMC) hyperparameter tuning, approximate MCMC sampler selection, and stochastic Stein variational gradient descent, SSDs deliver comparable inferences to standard SDs with orders of magnitude fewer likelihood evaluations.
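The idea can be illustrated with a small sketch (not the authors' released code): a kernel Stein discrepancy whose score function, a sum over per-datum potentials, is replaced by an unbiased minibatch estimate, so each evaluation of the Stein operator touches only a fraction of the likelihood terms. The IMQ kernel, the toy Gaussian-location model, the batch size, and all names below are illustrative assumptions.

import numpy as np

def imq_kernel_terms(x, y, c=1.0, beta=0.5):
    # IMQ kernel k(x,y) = (c^2 + ||x-y||^2)^(-beta) and the derivatives
    # needed to assemble the Langevin Stein kernel.
    d = x - y
    u = c**2 + d @ d
    k = u ** (-beta)
    grad_x = -2.0 * beta * d * u ** (-beta - 1)   # d/dx k(x, y)
    grad_y = -grad_x                              # d/dy k(x, y)
    trace = (2.0 * beta * len(d) * u ** (-beta - 1)
             - 4.0 * beta * (beta + 1) * (d @ d) * u ** (-beta - 2))
    return k, grad_x, grad_y, trace

def stochastic_score(x, data, batch_size, rng):
    # Unbiased minibatch estimate of grad_x log p(x | data) for a
    # unit-variance Gaussian-location model with a flat prior,
    # where the full score is sum_i (y_i - x).
    idx = rng.choice(len(data), size=batch_size, replace=False)
    return len(data) / batch_size * np.sum(data[idx] - x, axis=0)

def stochastic_ksd(sample, data, batch_size, rng):
    # Kernel Stein discrepancy of `sample` with subsampled scores:
    # KSD^2 = (1/n^2) sum_{i,j} [ s_i.s_j k + s_i.grad_y k + s_j.grad_x k + tr ].
    n = len(sample)
    scores = np.array([stochastic_score(x, data, batch_size, rng) for x in sample])
    total = 0.0
    for i in range(n):
        for j in range(n):
            k, gx, gy, tr = imq_kernel_terms(sample[i], sample[j])
            total += (scores[i] @ scores[j] * k
                      + scores[i] @ gy + scores[j] @ gx + tr)
    return np.sqrt(max(total, 0.0) / n**2)

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, size=(10_000, 1))             # large dataset
near = rng.normal(loc=2.0, scale=0.01, size=(100, 1))    # sample near the posterior
far = rng.normal(loc=0.0, scale=0.01, size=(100, 1))     # sample far from it
print(stochastic_ksd(near, data, batch_size=100, rng=rng))
print(stochastic_ksd(far, data, batch_size=100, rng=rng))

Under these assumptions, each score evaluation uses 100 likelihood terms instead of 10,000, and the subsampled discrepancy still ranks the near-posterior sample well below the far one.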

