Bernstein-von Mises theorems and uncertainty quantification for linear inverse problems

11/09/2018 · by Matteo Giordano, et al.

We consider the statistical inverse problem of approximating an unknown function f from a linear measurement corrupted by additive Gaussian white noise. We employ a nonparametric Bayesian approach with standard Gaussian priors, for which the posterior-based reconstruction of f corresponds to a Tikhonov regulariser f̅ with a Cameron-Martin space norm penalty. We prove a semiparametric Bernstein-von Mises theorem for a large collection of linear functionals of f, implying that semiparametric posterior estimation and uncertainty quantification are valid and optimal from a frequentist point of view. The result is illustrated and further developed for some examples both in mildly and severely ill-posed cases. For the problem of recovering the source function in elliptic partial differential equations, we also obtain a nonparametric version of the theorem that entails the convergence of the posterior distribution to a fixed infinite-dimensional Gaussian probability measure with minimal covariance in suitable function spaces. As a consequence, we show that the distribution of the Tikhonov regulariser f̅ is asymptotically normal and attains the information lower bound, and that credible sets centred at f̅ have correct frequentist coverage and optimal diameter.
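As an illustrative sketch (not taken from the paper), the correspondence between the posterior-based reconstruction and the Tikhonov regulariser can be checked in a finite-dimensional discretisation: with observations Y = Af + σξ, Gaussian noise ξ, and a Gaussian prior f ~ N(0, Λ⁻¹), the posterior mean coincides with the minimiser of ‖Y − Af‖² + σ²‖f‖²_Λ, where the Λ-weighted norm plays the role of the Cameron-Martin space penalty. All variable names and dimensions below are hypothetical choices for the demonstration.

```python
import numpy as np

# Finite-dimensional sketch of the linear inverse problem Y = A f + sigma * xi,
# with Gaussian prior f ~ N(0, Lambda^{-1}) (identity precision here, standing
# in for the Cameron-Martin norm penalty).
rng = np.random.default_rng(0)
n, d = 50, 20
A = rng.normal(size=(n, d)) / np.sqrt(n)   # discretised forward operator
f_true = rng.normal(size=d)
sigma = 0.1
Y = A @ f_true + sigma * rng.normal(size=n)
Lam = np.eye(d)                            # prior precision matrix

# Posterior mean under the conjugate Gaussian model:
#   mean = (sigma^{-2} A^T A + Lambda)^{-1} sigma^{-2} A^T Y
f_post = np.linalg.solve(A.T @ A / sigma**2 + Lam, A.T @ Y / sigma**2)

# Tikhonov regulariser: minimiser of ||Y - A f||^2 + sigma^2 f^T Lambda f,
# obtained from its normal equations (A^T A + sigma^2 Lambda) f = A^T Y.
f_tik = np.linalg.solve(A.T @ A + sigma**2 * Lam, A.T @ Y)

# The two routes agree: posterior estimation and Tikhonov regularisation
# produce the same reconstruction in this conjugate setting.
print(np.allclose(f_post, f_tik))
```

The agreement of the two solves is exactly the finite-dimensional shadow of the identification f̅ = posterior mean used in the abstract; the Bernstein-von Mises results then concern the fluctuations of the posterior around this point.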


Related research

07/31/2020 · Statistical guarantees for Bayesian uncertainty quantification in non-linear inverse problems with Gaussian process priors
Bayesian inference and uncertainty quantification in a general class of ...

01/06/2022 · Analyticity and sparsity in uncertainty quantification for PDEs with Gaussian random field inputs
We establish summability results for coefficient sequences of Wiener-Her...

10/11/2021 · Optional Pólya trees: posterior rates and uncertainty quantification
We consider statistical inference in the density estimation model using ...

05/30/2022 · Infinite-dimensional optimization and Bayesian nonparametric learning of stochastic differential equations
The paper has two major themes. The first part of the paper establishes ...

12/08/2022 · Minimizers of the Onsager-Machlup functional are strong posterior modes
In this work we connect two notions: That of the nonparametric mode of a...

02/26/2020 · Uncertainty Quantification for Sparse Deep Learning
Deep learning methods continue to have a decided impact on machine learn...

10/09/2020 · Regularising linear inverse problems under unknown non-Gaussian white noise
We deal with the solution of a generic linear inverse problem in the Hil...
