Posterior covariance information criterion for arbitrary loss functions

06/13/2022
by Yukito Iba et al.

We propose a novel, computationally low-cost method for estimating the predictive risk of Bayesian methods under arbitrary loss functions. The proposed method utilises the posterior covariance and yields estimators of both the Gibbs and the plug-in generalization errors. We present theoretical guarantees for the proposed method, clarifying the connections between the widely applicable information criterion (WAIC), Bayesian sensitivity analysis, and the infinitesimal-jackknife approximation of Bayesian leave-one-out cross-validation. An application to differentially private learning is also discussed.
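The abstract describes a bias correction for the training loss built from posterior covariances. As a rough illustration only (not the paper's exact formula), the sketch below assumes a WAIC-like form for the Gibbs generalization error: the posterior-averaged training loss plus, for each observation, the posterior covariance between that observation's loss and its negative log-likelihood. The toy model, sample sizes, and variable names are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n observations from a normal model with unknown mean.
n = 50
y = rng.normal(0.5, 1.0, size=n)

# Conjugate posterior for the mean (known unit variance, flat prior):
# theta | y ~ N(ybar, 1/n).  Draw S posterior samples.
S = 4000
theta = rng.normal(y.mean(), np.sqrt(1.0 / n), size=S)

# Per-observation loss at each posterior draw, shape (S, n).
# Here the loss happens to be the negative log-likelihood, but the
# covariance-based correction applies to an arbitrary loss.
loss = 0.5 * np.log(2 * np.pi) + 0.5 * (y[None, :] - theta[:, None]) ** 2
loglik = -loss  # log p(y_i | theta)

# Assumed PCIC-style estimate of the Gibbs generalization error:
# posterior-averaged training loss plus the mean posterior covariance
# Cov_post(loss_i, -log p(y_i | theta)) over observations.
train_gibbs = loss.mean()
cov_corr = np.mean(
    [np.cov(loss[:, i], -loglik[:, i])[0, 1] for i in range(n)]
)
pcic = train_gibbs + cov_corr
print(pcic)
```

With the log loss, the covariance term reduces to the posterior variance of the log-likelihood, so the correction is nonnegative and the estimate recovers a WAIC-style penalized training loss; the whole computation needs only posterior samples, which is what makes the approach computationally cheap.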

Related research

- Posterior Covariance Information Criterion (06/25/2021): We introduce an information criterion, PCIC, for predictive evaluation b...
- On Statistical Efficiency in Learning (12/24/2020): A central issue of many statistical learning problems is to select an ap...
- Mathematical Theory of Bayesian Statistics for Unknown Information Source (06/11/2022): In statistical inference, uncertainty is unknown and all models are wron...
- Robust leave-one-out cross-validation for high-dimensional Bayesian models (09/19/2022): Leave-one-out cross-validation (LOO-CV) is a popular method for estimati...
- The E-Posterior (01/03/2023): We develop a representation of a decision maker's uncertainty based on e...
- Bayesian EWMA and CUSUM Control Charts Under Different Loss Functions (07/20/2020): The Exponentially Weighted Moving Average (EWMA) and Cumulative Sum (CUS...
- Optimal minimization of the covariance loss (05/03/2022): Let X be a random vector valued in ℝ^m such that ‖X‖_2 ≤ 1 almost surely. F...
