Learning Model Reparametrizations: Implicit Variational Inference by Fitting MCMC distributions

by Michalis K. Titsias et al.

We introduce a new algorithm for approximate inference that combines reparametrization, Markov chain Monte Carlo, and variational methods. We construct a highly flexible implicit variational distribution synthesized from an arbitrary Markov chain Monte Carlo operation and a deterministic transformation that can be optimized using the reparametrization trick. Unlike current methods for implicit variational inference, our method avoids computing log density ratios and is therefore easily applicable to arbitrary continuous and differentiable models. We demonstrate the proposed algorithm by fitting banana-shaped distributions and by training variational autoencoders.
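The abstract's core ingredients can be illustrated with a minimal sketch: draw samples with a short MCMC chain, push them through a deterministic, parameter-dependent transformation, and estimate gradients of an expectation with the reparametrization trick. Everything concrete below (the standard-normal target, the random-walk Metropolis kernel, the affine transformation, and the squared objective) is an illustrative assumption for exposition, not the construction used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_samples(n_samples, n_steps=50):
    """Draw samples from a standard normal via random-walk Metropolis.
    (Illustrative stand-in for the arbitrary MCMC operation in the abstract.)"""
    x = rng.normal(size=n_samples)
    for _ in range(n_steps):
        prop = x + 0.5 * rng.normal(size=n_samples)
        # Log acceptance ratio for a standard-normal target.
        log_alpha = 0.5 * (x**2 - prop**2)
        accept = np.log(rng.uniform(size=n_samples)) < log_alpha
        x = np.where(accept, prop, x)
    return x

# Deterministic, parameter-dependent transformation T_theta(eps) = mu + sigma * eps.
mu, sigma = 1.5, 0.8

eps = metropolis_samples(100_000)   # parameter-free noise from the chain
z = mu + sigma * eps                # reparametrized variational sample

# Reparametrized Monte Carlo gradient of E[z^2] with respect to mu:
# d/dmu E[(mu + sigma*eps)^2] = E[2*(mu + sigma*eps)] = 2*mu, since E[eps] = 0.
grad_mu = np.mean(2.0 * z)
print(grad_mu)
```

Because the transformation is deterministic in the parameters while the MCMC noise is parameter-free, the gradient passes through the sample directly, with no log-density-ratio estimation involved.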


MetFlow: A New Efficient Method for Bridging the Gap between Markov Chain Monte Carlo and Variational Inference

In this contribution, we propose a new computationally efficient method ...

Stochastic Variational Inference for GARCH Models

Stochastic variational inference algorithms are derived for fitting vari...

Implicit copula variational inference

Key to effective generic, or "black-box", variational inference is the s...

Toward Unlimited Self-Learning Monte Carlo with Annealing Process Using VAE's Implicit Isometricity

Self-learning Monte Carlo (SLMC) methods are recently proposed to accele...

Learning variational autoencoders via MCMC speed measures

Variational autoencoders (VAEs) are popular likelihood-based generative ...

The Theory and Algorithm of Ergodic Inference

Approximate inference algorithm is one of the fundamental research field...

Geometric variational inference

Efficiently accessing the information contained in non-linear and high d...
