Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion

08/10/2020
by   Martin Eigel, et al.

Transport maps have become a popular mechanism to express complicated probability densities using sample propagation through an optimized push-forward. Besides their broad applicability and well-known success, transport maps suffer from several drawbacks, such as numerical inaccuracies induced by the optimization process and the fact that sampling schemes have to be employed when quantities of interest, e.g. moments, are to be computed. This paper presents a novel method for the accurate functional approximation of probability density functions (PDFs) that addresses these issues. By interpreting the pull-back of a target PDF through an inexact transport map as a perturbed reference density, a subsequent functional representation in a more accessible format allows for efficient and more accurate computation of the desired quantities. We introduce a layer-based approximation of the perturbed reference density in an appropriate coordinate system to split the high-dimensional representation problem into a set of independent approximations, for which separately chosen orthonormal basis functions are available. This effectively motivates the notion of h- and p-refinement (i.e. "mesh size" and polynomial degree) for the approximation of high-dimensional PDFs. To circumvent the curse of dimensionality and enable sampling-free access to certain quantities of interest, a low-rank reconstruction in the tensor train format is employed via the Variational Monte Carlo method. An a priori convergence analysis of the developed approach is derived in terms of the Hellinger distance and the Kullback-Leibler divergence. Applications comprising Bayesian inverse problems and several degrees of concentrated densities demonstrate the (superior) convergence in comparison to Monte Carlo and Markov chain Monte Carlo methods.
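To give a concrete sense of the tensor train (TT) format the abstract refers to, the following is a minimal sketch of a TT decomposition of a full tensor via sequential truncated SVDs (the classical TT-SVD construction), not the paper's actual reconstruction pipeline. All function names are illustrative.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a full tensor into tensor-train cores via sequential truncated SVDs."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    rank = 1
    mat = tensor.reshape(rank * shape[0], -1)
    for k in range(d - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r_new = min(max_rank, len(s))
        # Core k has shape (previous rank, mode size, new rank).
        cores.append(u[:, :r_new].reshape(rank, shape[k], r_new))
        # Carry the remaining factor to the next unfolding.
        mat = (np.diag(s[:r_new]) @ vt[:r_new]).reshape(r_new * shape[k + 1], -1)
        rank = r_new
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into a full tensor."""
    result = cores[0]
    for core in cores[1:]:
        result = np.tensordot(result, core, axes=([-1], [0]))
    return result.squeeze(axis=(0, -1))
```

For a tensor of genuinely low TT rank, the truncated decomposition is exact; its storage grows linearly in the dimension d rather than exponentially, which is what makes sampling-free evaluation of functionals tractable in high dimensions.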

Related research

03/07/2022 · Convergence Speed and Approximation Accuracy of Numerical MCMC
When implementing Markov Chain Monte Carlo (MCMC) algorithms, perturbati...

11/22/2021 · Bayesian Inversion of Log-normal Eikonal Equations
We study the Bayesian inverse problem for inferring the log-normal slown...

03/05/2023 · Self-reinforced polynomial approximation methods for concentrated probability densities
Transport map methods offer a powerful statistical learning tool that ca...

10/02/2018 · Approximation and sampling of multivariate probability distributions in the tensor train decomposition
General multivariate distributions are notoriously expensive to sample f...

07/14/2020 · Deep Composition of Tensor Trains using Squared Inverse Rosenblatt Transports
Characterising intractable high-dimensional random variables is one of t...

06/12/2020 · Sparse approximation of triangular transports on bounded domains
Let ρ and π be two probability measures on [-1,1]^d with positive and an...

11/13/2021 · Computing f-Divergences and Distances of High-Dimensional Probability Density Functions – Low-Rank Tensor Approximations
Very often, in the course of uncertainty quantification tasks or data an...
