Convergence rates for optimised adaptive importance samplers

03/28/2019
by   Ömer Deniz Akyıldız, et al.

Adaptive importance samplers are adaptive Monte Carlo algorithms for estimating expectations with respect to a target distribution; they adapt themselves over iterations to obtain better estimators. Although it is straightforward to show that they enjoy the same theoretical guarantees as importance sampling with respect to the sample size, their behaviour over the number of iterations has been left relatively unexplored, even though these adaptive algorithms aim at improving the proposal quality over time. In this work, we explore an adaptation strategy based on convex optimisation which leads to a class of adaptive importance samplers, termed optimised adaptive importance samplers (OAIS). These samplers rely on an adaptation idea based on minimising the χ^2-divergence between an exponential-family proposal and the target. The analysed algorithms are closely related to adaptive importance samplers which minimise the variance of the weight function. We first prove non-asymptotic error bounds for the mean squared errors (MSEs) of these algorithms, which explicitly depend on the number of iterations and the number of particles together. The non-asymptotic bounds derived in this paper imply that, when the target belongs to the exponential family, the L_2 errors of the optimised samplers converge to the perfect Monte Carlo sampling error O(1/√(N)). We also show that, when the target does not belong to the exponential family, the asymptotic error rate is O(√(ρ^*/N)), where ρ^* is the minimum χ^2-divergence between the target and an exponential-family proposal.
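As a rough illustration of the adaptation idea described in the abstract, the sketch below adapts a 1D Gaussian (exponential-family) proposal by stochastic gradient descent on rho(theta) = E_{q_theta}[(pi/q_theta)^2], which equals the χ^2-divergence up to normalisation. This is only a minimal sketch under stated assumptions: the toy target, learning rate, particle numbers, and all names are illustrative and not taken from the paper.

# Minimal OAIS-style sketch (assumed setup, not the paper's exact algorithm):
# a 1D Gaussian proposal q_theta is adapted by stochastic gradient descent on
# rho(theta) = E_q[(pi/q_theta)^2], using the identity
# grad_theta rho = -E_q[w(x)^2 * grad_theta log q_theta(x)] with w = pi/q_theta.
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Unnormalised toy target: equal-weight mixture of N(-2, 1) and N(2, 1).
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

def log_proposal(x, mu, log_sigma):
    sigma = np.exp(log_sigma)
    return -0.5 * ((x - mu) / sigma) ** 2 - log_sigma - 0.5 * np.log(2 * np.pi)

mu, log_sigma = 0.0, np.log(5.0)   # initial proposal parameters
N, T, lr = 1000, 200, 1e-3         # particles per iteration, iterations, step size

for t in range(T):
    sigma = np.exp(log_sigma)
    x = mu + sigma * rng.standard_normal(N)           # x ~ q_theta
    logw = log_target(x) - log_proposal(x, mu, log_sigma)
    w = np.exp(logw - logw.max())                     # stabilised weights

    # Score function of the Gaussian proposal w.r.t. (mu, log_sigma).
    score_mu = (x - mu) / sigma ** 2
    score_ls = ((x - mu) / sigma) ** 2 - 1.0

    # Stochastic gradient of rho(theta), up to a positive constant from the
    # weight stabilisation above.
    grad_mu = -np.mean(w ** 2 * score_mu)
    grad_ls = -np.mean(w ** 2 * score_ls)

    mu -= lr * grad_mu
    log_sigma -= lr * grad_ls

# Self-normalised importance sampling estimate of E_pi[X] with the adapted proposal.
sigma = np.exp(log_sigma)
x = mu + sigma * rng.standard_normal(N)
logw = log_target(x) - log_proposal(x, mu, log_sigma)
w = np.exp(logw - logw.max())
print("adapted mu, sigma:", mu, sigma)
print("E_pi[X] estimate :", np.sum(w * x) / np.sum(w))

The score-function form of the gradient is used here because it only requires samples from the current proposal and evaluations of the unnormalised target; the paper's convex-optimisation analysis concerns exponential-family proposals, of which the Gaussian above is one instance.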
