Lower Bounds on the Bayesian Risk via Information Measures

03/22/2023
by Amedeo Roberto Esposito et al.

This paper focuses on parameter estimation and introduces a new method for lower bounding the Bayesian risk. The method admits virtually any information measure, including Rényi's α-Divergences, φ-Divergences, and Sibson's α-Mutual Information. The approach treats divergences as functionals of measures and exploits the duality between spaces of measures and spaces of functions: a lower bound on the risk in terms of any information measure is obtained by upper bounding its dual via Markov's inequality. Because divergences satisfy Data-Processing Inequalities, the resulting impossibility results are estimator-independent. The results are then applied to settings of interest involving both discrete and continuous parameters, including the “Hide-and-Seek” problem, and compared to state-of-the-art techniques. An important observation is that the behaviour of the lower bound as a function of the number of samples depends on the choice of the information measure. This is leveraged by introducing a new divergence, inspired by the “Hockey-Stick” Divergence, which is demonstrated empirically to provide the largest lower bound across all considered settings. If the observations are subject to privatisation, stronger impossibility results can be obtained via Strong Data-Processing Inequalities. The paper also discusses some generalisations and alternative directions.
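To convey the flavour of the approach, the following is a minimal sketch of how Markov's inequality combines with a change of measure, instantiated here with Sibson's α-Mutual Information; it is an illustrative reconstruction under assumed notation (prior $P_W$, observations $X$, arbitrary estimator $\hat{w}(X)$, loss $\ell$, $\alpha > 1$), not the paper's exact statement.

For any $\rho > 0$, Markov's inequality applied to the risk $R = \mathbb{E}\big[\ell(W,\hat{w}(X))\big]$ gives
$$R \;\ge\; \rho \,\big(1 - P_{WX}(E_\rho)\big), \qquad E_\rho = \{(w,x) : \ell(w,\hat{w}(x)) \le \rho\}.$$
Hölder's inequality with respect to a product reference measure $P_W \times Q_X$, with $Q_X$ taken as the minimiser defining Sibson's $I_\alpha(W;X) = \min_{Q_X} D_\alpha\big(P_{WX}\,\big\|\,P_W \times Q_X\big)$, then yields
$$P_{WX}(E_\rho) \;\le\; e^{\frac{\alpha-1}{\alpha} I_\alpha(W;X)}\, L_W(\rho)^{\frac{\alpha-1}{\alpha}}, \qquad L_W(\rho) = \sup_{w'} P_W\big(\ell(W,w') \le \rho\big),$$
where $L_W(\rho)$ is the small-ball probability of the prior. Since $L_W(\rho)$ does not depend on the estimator, the resulting bound
$$R \;\ge\; \sup_{\rho > 0}\, \rho\,\Big(1 - e^{\frac{\alpha-1}{\alpha} I_\alpha(W;X)}\, L_W(\rho)^{\frac{\alpha-1}{\alpha}}\Big)$$
is an estimator-independent impossibility result of the kind described in the abstract.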


