Importance Weighting and Variational Inference

08/27/2018
by Justin Domke, et al.

Recent work has used importance sampling ideas to obtain tighter variational bounds on likelihoods. We clarify the applicability of these ideas to pure probabilistic inference by showing that the resulting Importance Weighted Variational Inference (IWVI) technique is an instance of augmented variational inference, thus identifying the looseness in previous work. Experiments confirm IWVI's practicality for probabilistic inference. As a second contribution, we investigate inference with elliptical distributions, which improves accuracy in low dimensions and convergence in high dimensions.
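The importance-weighted bound the abstract refers to replaces the single-sample ELBO with the log of an average of M importance weights, which tightens the bound as M grows. The sketch below is a minimal toy illustration, not the paper's implementation: it assumes a hypothetical 1-D unnormalized target (a standard normal, so the true log normalizer is 0) and a Gaussian proposal, and estimates the M-sample bound by plain Monte Carlo.

```python
import numpy as np

def log_p(z):
    # Unnormalized log-density of a toy target: standard normal, so log Z = 0.
    return -0.5 * z**2 - 0.5 * np.log(2.0 * np.pi)

def log_q(z, mu, sigma):
    # Log-density of the Gaussian proposal q(z) = N(mu, sigma^2).
    return -0.5 * ((z - mu) / sigma) ** 2 - np.log(sigma) - 0.5 * np.log(2.0 * np.pi)

def log_mean_exp(a, axis):
    # Numerically stable log of the mean of exp(a) along `axis`.
    m = np.max(a, axis=axis, keepdims=True)
    return np.squeeze(m, axis=axis) + np.log(np.mean(np.exp(a - m), axis=axis))

def iw_bound(mu, sigma, M, n_outer=20000, seed=0):
    # Monte Carlo estimate of the importance-weighted bound
    #   E_{z_1..z_M ~ q} [ log (1/M) sum_m p(z_m)/q(z_m) ] <= log Z,
    # which reduces to the standard ELBO when M = 1.
    rng = np.random.default_rng(seed)
    z = mu + sigma * rng.standard_normal((n_outer, M))
    log_w = log_p(z) - log_q(z, mu, sigma)
    return float(np.mean(log_mean_exp(log_w, axis=1)))
```

With a mismatched proposal such as `mu=1.0, sigma=1.5`, the `M=1` bound (the ordinary ELBO) is noticeably below the true value `log Z = 0`, while larger `M` tightens it toward 0, which is the effect the paper exploits for inference.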

