Fixed-Form Variational Posterior Approximation through Stochastic Linear Regression

06/28/2012
by Tim Salimans, et al.

We propose a general algorithm for approximating nonstandard Bayesian posterior distributions. The algorithm minimizes the Kullback-Leibler divergence of an approximating distribution to the intractable posterior distribution. Our method can be used to approximate any posterior distribution, provided that it is given in closed form up to the proportionality constant. The approximation can be any distribution in the exponential family or any mixture of such distributions, which means that it can be made arbitrarily precise. Several examples illustrate the speed and accuracy of our approximation method in practice.
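The abstract describes fitting a fixed-form exponential-family approximation by treating the KL fixed-point condition as a linear regression of the unnormalized log posterior on the sufficient statistics, estimated stochastically from draws of the current approximation. The sketch below illustrates that idea on a 1-D toy problem with a Gaussian approximation; the target, step size, and iteration counts are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p(x):
    # Unnormalized log posterior known in closed form up to a constant;
    # here a toy target proportional to N(2, 0.5).
    return -(x - 2.0) ** 2 / (2 * 0.5)

# Regression coefficients a = (a0, a1, a2) for features T(x) = (1, x, x^2).
# Start from q = N(0, 1): a1 = mu/sigma^2 = 0, a2 = -1/(2 sigma^2) = -0.5.
a = np.array([0.0, 0.0, -0.5])
C = np.eye(3)      # running estimate of E_q[T(x) T(x)^T]
g = C @ a          # running estimate of E_q[T(x) log_p(x)]
w = 0.02           # step size for the exponential running averages (illustrative)
avg, n_avg = np.zeros(3), 0

for t in range(5000):
    sigma2 = -1.0 / (2 * a[2])
    mu = a[1] * sigma2
    x = rng.normal(mu, np.sqrt(sigma2))   # draw from the current approximation q
    T = np.array([1.0, x, x * x])
    # Stochastic linear-regression updates of the sufficient-statistic moments
    g = (1 - w) * g + w * T * log_p(x)
    C = (1 - w) * C + w * np.outer(T, T)
    a = np.linalg.solve(C, g)             # regression coefficients -> natural params
    a[2] = min(a[2], -1e-3)               # keep the implied variance positive
    if t >= 2500:                         # average the second half of the iterates
        avg += a
        n_avg += 1

a = avg / n_avg
sigma2 = -1.0 / (2 * a[2])
mu = a[1] * sigma2
print(mu, sigma2)   # close to the true posterior mean 2.0 and variance 0.5
```

Because the toy log posterior is exactly quadratic in x, the regression recovers the true coefficients as the running averages burn in, so the fitted Gaussian matches the target; for genuinely nonstandard posteriors the same updates yield the best exponential-family fit in KL.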


Related research

- 11/18/2020: Understanding Variational Inference in Function-Space. "Recent work has attempted to directly approximate the `function-space' o..."
- 02/05/2019: Asymptotic Consistency of α-Rényi-Approximate Posteriors. "In this work, we study consistency properties of α-Rényi approximate pos..."
- 12/01/2020: On The Gaussian Approximation To Bayesian Posterior Distributions. "The present article derives the minimal number N of observations needed ..."
- 05/15/2018: The Hierarchical Adaptive Forgetting Variational Filter. "A common problem in Machine Learning and statistics consists in detectin..."
- 06/02/2020: Meta Learning as Bayes Risk Minimization. "Meta-Learning is a family of methods that use a set of interrelated task..."
- 02/19/2019: Scalable Thompson Sampling via Optimal Transport. "Thompson sampling (TS) is a class of algorithms for sequential decision-..."
- 07/08/2018: BALSON: Bayesian Least Squares Optimization with Nonnegative L1-Norm Constraint. "A Bayesian approach termed BAyesian Least Squares Optimization with Nonn..."
