Learning Model Reparametrizations: Implicit Variational Inference by Fitting MCMC distributions

08/04/2017
by   Michalis K. Titsias, et al.

We introduce a new algorithm for approximate inference that combines reparametrization, Markov chain Monte Carlo, and variational methods. We construct a highly flexible implicit variational distribution synthesized by an arbitrary Markov chain Monte Carlo operation and a deterministic transformation that can be optimized using the reparametrization trick. Unlike current methods for implicit variational inference, our method avoids the computation of log density ratios and is therefore easily applicable to arbitrary continuous and differentiable models. We demonstrate the proposed algorithm by fitting banana-shaped distributions and by training variational autoencoders.
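To make the construction concrete, the sketch below illustrates only the plumbing described in the abstract: a Markov chain Monte Carlo operation that does not depend on the variational parameters produces samples, a learnable deterministic transformation pushes them forward, and gradients with respect to the transformation's parameters are obtained with the reparametrization trick, i.e. by differentiating through the transformation alone. The banana-shaped target, the affine form of the transformation, the random-walk Metropolis kernel, and the stand-in objective are illustrative assumptions, not details taken from the paper, whose actual training criterion is not reproduced here.

# Minimal sketch (JAX) of MCMC samples pushed through a learnable deterministic
# map, with reparametrized gradients; objective is a placeholder, not the paper's.
import jax
import jax.numpy as jnp

def log_p(x):
    # Hypothetical banana-shaped target density (unnormalized).
    return -0.5 * (x[0] ** 2 / 4.0 + (x[1] - 0.25 * x[0] ** 2) ** 2)

def log_q0(z):
    # Fixed base distribution targeted by the MCMC chain: standard Gaussian.
    return -0.5 * jnp.sum(z ** 2)

def rw_metropolis(key, n_steps=50, step=0.5, dim=2):
    # Random-walk Metropolis chain; it has no dependence on theta, which is
    # what makes the reparametrization trick applicable downstream.
    def body(carry, key_t):
        z, lp = carry
        k1, k2 = jax.random.split(key_t)
        prop = z + step * jax.random.normal(k1, (dim,))
        lp_prop = log_q0(prop)
        accept = jnp.log(jax.random.uniform(k2)) < lp_prop - lp
        z = jnp.where(accept, prop, z)
        lp = jnp.where(accept, lp_prop, lp)
        return (z, lp), z
    z0 = jnp.zeros(dim)
    keys = jax.random.split(key, n_steps)
    (_, _), samples = jax.lax.scan(body, (z0, log_q0(z0)), keys)
    return samples[-1]

def transform(theta, z):
    # Learnable deterministic transformation T_theta: here a simple affine map.
    W, b = theta
    return W @ z + b

def objective(theta, z_batch):
    # Stand-in objective: expected model log density of the transformed samples.
    return jnp.mean(jax.vmap(lambda z: log_p(transform(theta, z)))(z_batch))

key = jax.random.PRNGKey(0)
theta = (jnp.eye(2), jnp.zeros(2))
for it in range(100):
    key, sub = jax.random.split(key)
    z_batch = jax.vmap(rw_metropolis)(jax.random.split(sub, 64))
    # Reparametrized gradient: differentiate through T_theta only; the MCMC
    # samples are treated as fixed noise.
    val, grads = jax.value_and_grad(objective)(theta, z_batch)
    theta = jax.tree_util.tree_map(lambda t, g: t + 1e-2 * g, theta, grads)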

Related research

MetFlow: A New Efficient Method for Bridging the Gap between Markov Chain Monte Carlo and Variational Inference (02/27/2020)
In this contribution, we propose a new computationally efficient method ...

Stochastic Variational Inference for GARCH Models (08/29/2023)
Stochastic variational inference algorithms are derived for fitting vari...

Implicit copula variational inference (11/18/2021)
Key to effective generic, or "black-box", variational inference is the s...

Toward Unlimited Self-Learning Monte Carlo with Annealing Process Using VAE's Implicit Isometricity (11/25/2022)
Self-learning Monte Carlo (SLMC) methods are recently proposed to accele...

Learning variational autoencoders via MCMC speed measures (08/26/2023)
Variational autoencoders (VAEs) are popular likelihood-based generative ...

The Theory and Algorithm of Ergodic Inference (11/17/2018)
Approximate inference algorithm is one of the fundamental research field...

Geometric variational inference (05/21/2021)
Efficiently accessing the information contained in non-linear and high d...
