Certified Dimension Reduction for Bayesian Updating with the Cross-Entropy Method

06/07/2022
by Max Ehre et al.

In inverse problems, the parameters of a model are estimated based on observations of the model response. The Bayesian approach is a powerful way to solve such problems: one formulates a prior distribution for the parameters, which is updated with the observations to obtain the posterior parameter distribution. Computing the posterior distribution can be challenging when, e.g., prior and posterior differ significantly from one another and/or the parameter space is high-dimensional. To treat inverse problems with a significant distance between prior and posterior, we use a sequence of importance sampling measures that arise by tempering the likelihood. Each importance sampling measure is identified by cross-entropy minimization, as proposed in the context of Bayesian inverse problems by Engel et al. (2021). To efficiently address problems with high-dimensional parameter spaces, we set up the minimization procedure in a low-dimensional subspace of the original parameter space. The principal idea is to analyse the spectrum of the second-moment matrix of the gradient of the log-likelihood function to identify a suitable subspace. Following Zahm et al. (2021), we provide an upper bound on the Kullback-Leibler divergence between the full-dimensional and the subspace posterior, which can be utilized to determine the effective dimension of the inverse problem for a prescribed approximation error bound. We further suggest heuristic criteria for optimally selecting the number of model and model-gradient evaluations in each iteration of the importance sampling sequence. We investigate the performance of this approach using examples from engineering mechanics posed in parameter spaces of varying dimension.
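The subspace-identification step described in the abstract — estimating the second-moment matrix of the log-likelihood gradient, eigendecomposing it, and truncating at a prescribed error tolerance — can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names, the plain Monte Carlo estimator over prior samples, and the tolerance value are choices made for the sketch.

```python
import numpy as np

def certified_subspace(grad_log_lik, prior_samples, tol=1e-2):
    """Sketch of certified dimension reduction for a Bayesian inverse problem.

    Builds a Monte Carlo estimate of H = E_prior[(grad log L)(grad log L)^T],
    then keeps the smallest number r of leading eigenvectors such that the sum
    of the discarded eigenvalues falls below `tol` -- the trailing eigenvalue
    sum is the quantity that (up to a constant) bounds the Kullback-Leibler
    divergence between the full-dimensional and the subspace posterior.
    """
    G = np.stack([grad_log_lik(x) for x in prior_samples])  # (n, d) gradients
    H = G.T @ G / len(prior_samples)                        # (d, d), symmetric PSD
    eigvals, eigvecs = np.linalg.eigh(H)                    # ascending order
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]      # sort descending
    residual = eigvals.sum() - np.cumsum(eigvals)           # trailing sums
    r = int(np.argmax(residual <= tol)) + 1                 # effective dimension
    return r, eigvecs[:, :r]                                # subspace basis U
```

In the tempered cross-entropy scheme of the paper, the minimization at each tempering level would then be carried out over the reduced coordinates `U.T @ x`, with the complementary directions left at their prior distribution.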

