Quantifying Aleatoric and Epistemic Uncertainty in Machine Learning: Are Conditional Entropy and Mutual Information Appropriate Measures?

09/07/2022
by Eyke Hüllermeier, et al.

This short note is a critical discussion of the quantification of aleatoric and epistemic uncertainty in terms of conditional entropy and mutual information, respectively, which has recently been proposed in machine learning and has become quite common since then. More generally, we question the idea of an additive decomposition of total uncertainty into its aleatoric and epistemic constituents.
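The decomposition the note criticizes is commonly computed from an ensemble (or Bayesian posterior samples): total uncertainty is the entropy of the averaged predictive distribution, aleatoric uncertainty is the expected conditional entropy of the individual members, and epistemic uncertainty is their difference, i.e. the mutual information between the prediction and the model parameters. A minimal sketch of this additive decomposition (function names and the toy ensemble are illustrative, not from the paper):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in nats, treating 0 * log 0 as 0."""
    p = np.asarray(p, dtype=float)
    return -np.sum(np.where(p > 0, p * np.log(p), 0.0))

def uncertainty_decomposition(member_probs):
    """Entropy-based decomposition for an ensemble of predictors.

    member_probs: array of shape (n_members, n_classes); each row is one
    member's predictive distribution p(y | x, theta_i).
    Returns (total, aleatoric, epistemic).
    """
    member_probs = np.asarray(member_probs, dtype=float)
    mean_probs = member_probs.mean(axis=0)
    total = entropy(mean_probs)                # H of the model average
    aleatoric = np.mean([entropy(p) for p in member_probs])  # E[H(p_i)]
    epistemic = total - aleatoric              # mutual information I(Y; Theta)
    return total, aleatoric, epistemic

# Two members that are individually confident but disagree:
# low aleatoric, high epistemic uncertainty under this decomposition.
probs = [[0.9, 0.1], [0.1, 0.9]]
tu, au, eu = uncertainty_decomposition(probs)
```

By construction the three quantities satisfy total = aleatoric + epistemic; the note questions whether this additivity, and these two measures, are conceptually appropriate.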


Related research:

- Epistemic Uncertainty Sampling (08/31/2019)
- Analytic Mutual Information in Bayesian Neural Networks (01/24/2022)
- Towards calibrated and scalable uncertainty representations for neural networks (10/28/2019)
- Is the Volume of a Credal Set a Good Measure for Epistemic Uncertainty? (06/16/2023)
- Quantifying Aleatoric and Epistemic Uncertainty Using Density Estimation in Latent Space (12/05/2020)
- A Quantitative Comparison of Epistemic Uncertainty Maps Applied to Multi-Class Segmentation (09/22/2021)
- A Survey on Epistemic (Model) Uncertainty in Supervised Learning: Recent Advances and Applications (11/03/2021)
