
Monte Carlo Inference via Greedy Importance Sampling

01/16/2013
by Dale Schuurmans, et al.

We present a new method for conducting Monte Carlo inference in graphical models that combines explicit search with generalized importance sampling. The idea is to reduce the variance of importance sampling by searching for significant points in the target distribution. We prove that it is possible to introduce search and still maintain unbiasedness. We then demonstrate our procedure on a few simple inference tasks and show that it can improve the inference quality of standard MCMC methods, including Gibbs sampling, Metropolis sampling, and Hybrid Monte Carlo. This paper extends previous work, which showed how greedy importance sampling could be correctly realized in the one-dimensional case.
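As background for the variance-reduction idea above, the following is a minimal sketch of plain importance sampling, the baseline estimator the paper improves on: draws come from a proposal q and are weighted by p(x)/q(x) so the estimate of E_p[h(x)] stays unbiased. The function names and the toy normal target/proposal are illustrative assumptions, not from the paper; the paper's greedy variant additionally searches from each draw toward high-probability regions of p and reweights the visited points to preserve unbiasedness.

```python
import math
import random

def importance_estimate(target_p, proposal_q, proposal_sample, h, n=10000, seed=0):
    """Plain importance-sampling estimate of E_p[h(x)].

    Samples x ~ q and averages w(x) * h(x) with weight w(x) = p(x)/q(x).
    The estimator is unbiased whenever q puts mass wherever p does;
    its variance blows up when q misses significant regions of p,
    which is the problem greedy search is meant to mitigate.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = proposal_sample(rng)
        w = target_p(x) / proposal_q(x)
        total += w * h(x)
    return total / n

# Toy check: estimate E[x] = 0 under a standard normal target,
# sampling from a wider normal proposal (mean 0, sd 2).
def p(x):  # N(0, 1) density
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

def q(x):  # N(0, 2) density
    return math.exp(-x * x / 8.0) / (2.0 * math.sqrt(2 * math.pi))

est = importance_estimate(p, q, lambda rng: rng.gauss(0, 2), h=lambda x: x)
```

With a fixed seed the estimate lands close to the true mean of 0; the interesting regime for the paper is when q is badly matched to p and the raw weights alone give a high-variance estimate.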


06/13/2012

AND/OR Importance Sampling

The paper introduces AND/OR importance sampling for probabilistic graphi...

01/21/2019

A weighted Discrepancy Bound of quasi-Monte Carlo Importance Sampling

Importance sampling Monte-Carlo methods are widely used for the approxim...

05/26/2020

Exhaustive Neural Importance Sampling applied to Monte Carlo event generation

The generation of accurate neutrino-nucleus cross-section models needed ...

05/03/2022

Quantifying rare events in spotting: How far do wildfires spread?

Spotting refers to the transport of burning pieces of firebrand by wind ...

07/25/2017

Monte-Carlo acceleration: importance sampling and hybrid dynamic systems

The reliability of a complex industrial system can rarely be assessed an...

06/30/2021

Monte Carlo Variational Auto-Encoders

Variational auto-encoders (VAE) are popular deep latent variable models ...

05/24/2022

A Quadrature Rule combining Control Variates and Adaptive Importance Sampling

Driven by several successful applications such as in stochastic gradient...