Empirical priors and coverage of posterior credible sets in a sparse normal mean model

12/05/2018
by Ryan Martin, et al.

Bayesian methods provide a natural means for uncertainty quantification: credible sets can be obtained directly from the posterior distribution. But is this uncertainty quantification valid, in the sense that the posterior credible sets attain the nominal frequentist coverage probability? This paper investigates the validity of posterior uncertainty quantification based on a class of empirical priors in the sparse normal mean model. We prove that there are scenarios in which the empirical Bayes method provides valid uncertainty quantification while other methods may not, and finite-sample simulations confirm the asymptotic findings.
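Since the abstract refers to finite-sample coverage simulations, the following is a minimal sketch of how per-coordinate coverage can be checked numerically in the sparse normal mean model y_i = theta_i + N(0, 1). The hard-thresholding interval rule and all parameter values (n, s, signal, thresh) are illustrative assumptions only; this is not the empirical-prior posterior studied in the paper.

```python
import numpy as np

# Coverage simulation for the sparse normal mean model:
# y_i = theta_i + N(0, 1), i = 1..n, with most theta_i equal to zero.
# Interval rule (schematic stand-in, NOT the paper's method): hard-threshold
# at sqrt(2 log n); selected coordinates get a standard z-interval, the rest
# get the degenerate "interval" {0}.

rng = np.random.default_rng(1)
n, s, signal = 500, 10, 6.0        # dimension, sparsity, signal size (assumed)
n_reps, z = 1000, 1.96             # Monte Carlo replications, 95% normal quantile
thresh = np.sqrt(2 * np.log(n))    # universal threshold

theta = np.zeros(n)
theta[:s] = signal

cov_signal = np.zeros(n_reps)      # coverage over truly nonzero coordinates
cov_null = np.zeros(n_reps)        # coverage over truly zero coordinates
for r in range(n_reps):
    y = theta + rng.standard_normal(n)
    selected = np.abs(y) > thresh
    lower = np.where(selected, y - z, 0.0)
    upper = np.where(selected, y + z, 0.0)
    hit = (lower <= theta) & (theta <= upper)
    cov_signal[r] = hit[:s].mean()
    cov_null[r] = hit[s:].mean()

print(f"coverage on nonzero means: {cov_signal.mean():.3f}")
print(f"coverage on zero means:    {cov_null.mean():.3f}")
```

Reporting coverage separately for zero and nonzero coordinates highlights the phenomenon at issue: naive or plug-in rules can achieve nominal coverage on strong signals while behaving very differently on the null coordinates, which is exactly the kind of behavior the paper's coverage analysis addresses.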


