Revisiting the balance heuristic for estimating normalising constants

by Felipe J. Medina-Aguayo, et al.

Multiple importance sampling estimators are widely used for computing intractable normalising constants owing to their reliability and robustness. The celebrated balance heuristic estimator belongs to this class of methods and has proved very successful in computer graphics. The basic ingredients for computing the estimator are a set of proposal distributions, indexed by some discrete label, and a predetermined number of draws from each of these proposals. However, if the number of available proposals is much larger than the number of permitted importance points, one needs to select, possibly at random, which of these distributions will be used. This work focuses on that setting, exploring improvements and variations of the balance heuristic via a novel extended-space representation of the estimator, which leads to straightforward annealing schemes for variance reduction. In addition, we consider the intractable scenario where the proposal density is available only as a joint function with the discrete label, as may be encountered in problems where an ordering is imposed. For this case, we study combinations of correlated unbiased estimators that also fit into the extended-space representation and, in turn, provide further interesting solutions.
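As a concrete illustration of the balance heuristic described above (a sketch, not the authors' implementation), the following Python snippet estimates the normalising constant of an unnormalised Gaussian target using two Gaussian proposals with fixed draw counts. Each sample x drawn from proposal i contributes target(x) / sum_j n_j q_j(x), the standard balance-heuristic weighting; the particular means, scales and sample sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalised target: exp(-x^2 / 2); its true normalising
# constant is sqrt(2 * pi), so we can check the estimate.
def target_unnorm(x):
    return np.exp(-0.5 * x**2)

# Two Gaussian proposals (means, stds chosen for illustration)
# and a predetermined number of draws from each.
means = np.array([-1.0, 1.5])
stds = np.array([1.0, 2.0])
n_draws = np.array([5000, 5000])

def normal_pdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

# Balance heuristic estimator of the normalising constant:
# Z_hat = sum_i sum_k target(x_{i,k}) / sum_j n_j q_j(x_{i,k})
total = 0.0
for i, n in enumerate(n_draws):
    x = rng.normal(means[i], stds[i], size=n)
    mixture = sum(nj * normal_pdf(x, means[j], stds[j])
                  for j, nj in enumerate(n_draws))
    total += np.sum(target_unnorm(x) / mixture)

Z_hat = total
print(Z_hat)  # should be close to sqrt(2 * pi) ~ 2.5066
```

The denominator is the (count-weighted) mixture of all proposals evaluated at each sample, which is what makes the balance heuristic robust when individual proposals match the target poorly.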



Related papers:

- Generalizing the Balance Heuristic Estimator in Multiple Importance Sampling
- Variance Analysis of Multiple Importance Sampling Schemes
- Selection of proposal distributions for generalized importance sampling estimators
- Optimality in Noisy Importance Sampling
- Advances in Importance Sampling
- On a Metropolis-Hastings importance sampling estimator
- A New Unbiased and Efficient Class of LSH-Based Samplers and Estimators for Partition Function Computation in Log-Linear Models
