Discrete flow posteriors for variational inference in discrete dynamical systems

05/28/2018
by Laurence Aitchison, et al.

Each training step for a variational autoencoder (VAE) requires us to sample from the approximate posterior, so we usually choose simple (e.g. factorised) approximate posteriors in which sampling is an efficient computation that fully exploits GPU parallelism. However, such simple approximate posteriors are often insufficient, as they eliminate statistical dependencies in the posterior. While it is possible to use normalizing-flow approximate posteriors for continuous latents, some problems have discrete latents and strong statistical dependencies. The most natural way to model these dependencies is an autoregressive distribution, but sampling from such distributions is inherently sequential and thus slow. We develop a fast, parallel sampling procedure for autoregressive distributions based on fixed-point iterations, which enables efficient and accurate variational inference in discrete state-space latent variable dynamical systems. To optimize the variational bound, we considered two ways to evaluate probabilities: inserting the relaxed samples directly into the pmf for the discrete distribution, or converting to continuous logistic latent variables and interpreting the K-step fixed-point iterations as a normalizing flow. We found that converting to continuous latent variables gave considerable additional scope for mismatch between the true and approximate posteriors, which resulted in biased inferences; we thus used the former approach. Using our fast sampling procedure, we were able to realize the benefits of correlated posteriors, including accurate uncertainty estimates for one cell and accurate connectivity estimates for multiple cells, in an order of magnitude less time.
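To make the fixed-point sampling idea concrete, here is a minimal NumPy sketch, not the authors' implementation: it assumes binary latents and a logistic autoregressive conditional parameterised by a strictly lower-triangular weight matrix W (all names and shapes here are illustrative). With the uniform driving noise u held fixed, every timestep is updated simultaneously on each sweep; each sweep is guaranteed to make at least one more prefix position correct, so K = T sweeps reproduce the sequential sample exactly, and fewer sweeps can suffice when dependencies are short-range.

```python
# Minimal sketch of parallel autoregressive sampling via fixed-point
# iteration. Assumptions (ours, not from the paper's code): binary
# latents z_t, conditionals p(z_t = 1 | z_{<t}) = sigmoid(b_t + W_t z),
# with W strictly lower-triangular so position t sees only z_{<t}.
import numpy as np

def ar_logits(z, W, b):
    """Logits of p(z_t = 1 | z_{<t}) for all t at once."""
    return b + z @ W.T

def sample_sequential(W, b, u):
    """Ground truth: inverse-CDF autoregressive sampling, T sequential steps."""
    T = len(b)
    z = np.zeros(T)
    for t in range(T):
        p_t = 1.0 / (1.0 + np.exp(-(b[t] + W[t] @ z)))
        z[t] = float(u[t] < p_t)  # Bernoulli(p_t) via fixed uniform noise
    return z

def sample_fixed_point(W, b, u, K):
    """Parallel alternative: with the noise u held fixed, update all T
    positions in one vectorised sweep and iterate K times. After sweep k,
    positions 0..k-1 are provably correct, so K = T recovers the
    sequential sample exactly."""
    z = np.zeros(len(b))
    for _ in range(K):
        p = 1.0 / (1.0 + np.exp(-ar_logits(z, W, b)))
        z = (u < p).astype(float)  # all timesteps updated in parallel
    return z

rng = np.random.default_rng(0)
T = 50
W = np.tril(rng.normal(size=(T, T)), k=-1)  # strictly lower-triangular
b = rng.normal(size=T)
u = rng.uniform(size=T)
assert np.allclose(sample_sequential(W, b, u), sample_fixed_point(W, b, u, K=T))
```

The variational objective needs gradients, so in the paper hard thresholds like u < p are replaced by relaxed (logistic-noise) samples; the abstract's point is that evaluating those relaxed samples directly under the discrete pmf gave less scope for posterior mismatch than reinterpreting the K-step iteration as a normalizing flow over continuous latents.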
