
Quantifying Aleatoric and Epistemic Uncertainty in Machine Learning: Are Conditional Entropy and Mutual Information Appropriate Measures?

by Eyke Hüllermeier et al.
Universität München

This short note is a critical discussion of the quantification of aleatoric and epistemic uncertainty in terms of conditional entropy and mutual information, respectively, a practice recently proposed in machine learning that has since become quite common. More generally, we question the idea of an additive decomposition of total uncertainty into its aleatoric and epistemic constituents.
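The decomposition under discussion can be stated concretely: total uncertainty is the entropy of the Bayesian model average, aleatoric uncertainty is the expected conditional entropy over models, and epistemic uncertainty is their difference, i.e. the mutual information between the prediction and the model parameters. A minimal NumPy sketch of this standard computation (function names are illustrative, not from the paper):

```python
import numpy as np

def entropy(p, axis=-1):
    """Shannon entropy in nats along the given axis."""
    p = np.clip(p, 1e-12, 1.0)  # avoid log(0)
    return -np.sum(p * np.log(p), axis=axis)

def uncertainty_decomposition(probs):
    """probs: array of shape (n_models, n_classes), each row a predictive
    distribution p(y | x, theta_i) from an ensemble or posterior sample.

    Returns (total, aleatoric, epistemic), where
      total     = H[ mean_i p(y | x, theta_i) ]  (entropy of the model average)
      aleatoric = mean_i H[ p(y | x, theta_i) ]  (expected conditional entropy)
      epistemic = total - aleatoric              (mutual information I(y; theta))
    """
    mean_pred = probs.mean(axis=0)
    total = entropy(mean_pred)
    aleatoric = entropy(probs).mean()
    epistemic = total - aleatoric
    return total, aleatoric, epistemic

# Example: two models that are individually certain but disagree completely.
# The decomposition attributes all uncertainty (log 2 nats) to the epistemic part.
probs = np.array([[1.0, 0.0],
                  [0.0, 1.0]])
total, ale, epi = uncertainty_decomposition(probs)
```

In this example the model average is uniform, so total uncertainty is log 2, while each individual model has (near-)zero entropy; the additive decomposition assigns the entire amount to the epistemic term. Whether this attribution is appropriate is precisely what the note examines.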




Related papers:

- Epistemic Uncertainty Sampling
- Analytic Mutual Information in Bayesian Neural Networks
- Towards Calibrated and Scalable Uncertainty Representations for Neural Networks
- Is the Volume of a Credal Set a Good Measure for Epistemic Uncertainty?
- Quantifying Aleatoric and Epistemic Uncertainty Using Density Estimation in Latent Space
- A Quantitative Comparison of Epistemic Uncertainty Maps Applied to Multi-Class Segmentation
- A Survey on Epistemic (Model) Uncertainty in Supervised Learning: Recent Advances and Applications