Online Variational Filtering and Parameter Learning

10/26/2021
by Andrew Campbell, et al.

We present a variational method for online state estimation and parameter learning in state-space models (SSMs), a ubiquitous class of latent variable models for sequential data. As per standard batch variational techniques, we use stochastic gradients to simultaneously optimize a lower bound on the log evidence with respect to both model parameters and a variational approximation of the states' posterior distribution. However, unlike existing approaches, our method is able to operate in an entirely online manner, such that historic observations do not require revisitation after being incorporated and the cost of updates at each time step remains constant, despite the growing dimensionality of the joint posterior distribution of the states. This is achieved by utilizing backward decompositions of this joint posterior distribution and of its variational approximation, combined with Bellman-type recursions for the evidence lower bound and its gradients. We demonstrate the performance of this methodology across several examples, including high-dimensional SSMs and sequential Variational Auto-Encoders.
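
To make the backward decomposition and the Bellman-type recursion concrete, the following is a minimal JAX sketch for a one-dimensional linear-Gaussian SSM. It is not the authors' code: the toy model, the function and parameter names (V_next, quad_coeffs, online_filter, phi), the quadratic-coefficient representation of the value function, the decision to keep the model parameters fixed (no parameter learning), and the choice to optimize only the time-t variational parameters while freezing earlier ones are all simplifying assumptions made purely for illustration.

import jax
import jax.numpy as jnp

# Toy setup (illustrative assumptions, not taken from the paper):
#   x_1 ~ N(0, 1),  x_t = a*x_{t-1} + N(0, q2),  y_t = x_t + N(0, r2)
# Backward-factorised variational family:
#   q(x_{1:t}) = N(x_t; m, s^2) * prod_{s<t} N(x_s; A*x_{s+1} + b, c^2)
# In this linear-Gaussian case the value function V_t(x) is an exact quadratic
# v0 + v1*x + v2*x^2, so the Bellman-type recursion
#   V_t(x_t) = E_{q_{t-1|t}}[V_{t-1}(x_{t-1}) + log p(x_t|x_{t-1})]
#              + H(q_{t-1|t}) + log p(y_t|x_t)
# and the bound L_t = E_{q_t}[V_t(x_t)] + H(q_t) are available in closed form.

a, q2, r2 = 0.9, 0.5, 1.0                        # model parameters, kept fixed here
LOG2PI = jnp.log(2.0 * jnp.pi)

def log_normal(z, mean, var):
    return -0.5 * (LOG2PI + jnp.log(var) + (z - mean) ** 2 / var)

def V1(x, y1):
    # V_1(x) = log p(x_1) + log p(y_1 | x_1)
    return log_normal(x, 0.0, 1.0) + log_normal(y1, x, r2)

def V_next(x, phi, V_prev, y):
    # Evaluate V_t at x_t = x, given backward-kernel parameters in phi and the
    # quadratic coefficients of V_{t-1}; no past observations are touched.
    A, b, lc = phi["A"], phi["b"], phi["lc"]
    c2 = jnp.exp(2.0 * lc)
    mu = A * x + b                                           # mean of q_{t-1|t}(. | x)
    v0, v1, v2 = V_prev
    e_Vprev = v0 + v1 * mu + v2 * (mu ** 2 + c2)             # E[V_{t-1}(x_{t-1})]
    e_trans = -0.5 * (LOG2PI + jnp.log(q2)
                      + ((x - a * mu) ** 2 + a ** 2 * c2) / q2)  # E[log p(x_t|x_{t-1})]
    ent_back = 0.5 * (LOG2PI + 2.0 * lc + 1.0)               # H(q_{t-1|t})
    return e_Vprev + e_trans + ent_back + log_normal(y, x, r2)

def quad_coeffs(f):
    # Recover (v0, v1, v2) of a quadratic f from its values at x = 0, 1, -1.
    f0, fp, fm = f(0.0), f(1.0), f(-1.0)
    return jnp.stack([f0, 0.5 * (fp - fm), 0.5 * (fp + fm) - f0])

def elbo_t(phi, V_prev, y):
    # L_t = E_{q_t}[V_t] + H(q_t), with q_t(x) = N(x; m, exp(2*ls)).
    v0, v1, v2 = quad_coeffs(lambda x: V_next(x, phi, V_prev, y))
    m, s2 = phi["m"], jnp.exp(2.0 * phi["ls"])
    return v0 + v1 * m + v2 * (m ** 2 + s2) + 0.5 * (LOG2PI + jnp.log(s2) + 1.0)

grad_elbo = jax.jit(jax.grad(elbo_t))

def online_filter(ys, lr=0.05, n_inner=200):
    # Process observations one at a time with constant cost per step: only the
    # current observation and the three coefficients summarising V_{t-1} are used.
    V_prev = quad_coeffs(lambda x: V1(x, ys[0]))
    marginals = []
    for y in ys[1:]:
        phi = {"m": 0.0, "ls": 0.0, "A": 0.0, "b": 0.0, "lc": 0.0}
        for _ in range(n_inner):                 # gradient ascent on time-t params only
            g = grad_elbo(phi, V_prev, y)
            phi = {k: phi[k] + lr * g[k] for k in phi}
        marginals.append((float(phi["m"]), float(jnp.exp(phi["ls"]))))
        V_prev = quad_coeffs(lambda x: V_next(x, phi, V_prev, y))   # roll V forward
    return marginals

Each update uses only the current observation and the three coefficients summarising V_{t-1}, so past observations are never revisited and the per-step cost stays constant, mirroring the property described in the abstract; in a general nonlinear SSM the value function is no longer an exact quadratic and would itself have to be approximated.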

