Quantifying Aleatoric and Epistemic Uncertainty in Machine Learning: Are Conditional Entropy and Mutual Information Appropriate Measures?
This short note is a critical discussion of the quantification of aleatoric and epistemic uncertainty in terms of conditional entropy and mutual information, respectively, a practice that was recently proposed in machine learning and has since become quite common. More generally, we question the idea of an additive decomposition of total uncertainty into its aleatoric and epistemic constituents.
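For readers unfamiliar with the decomposition under discussion, the following is a minimal sketch of how it is commonly computed in practice (it is not taken from the note itself): given an ensemble of predictive distributions p(y | theta_i), total uncertainty is the entropy of the averaged prediction, aleatoric uncertainty is the expected conditional entropy, and epistemic uncertainty is their difference, i.e. the mutual information I(Y; theta). The ensemble values below are hypothetical.

```python
import numpy as np

def entropy(p, axis=-1):
    """Shannon entropy in nats; assumes p sums to 1 along `axis`."""
    return -np.sum(p * np.log(np.clip(p, 1e-12, 1.0)), axis=axis)

# Hypothetical ensemble of predictive distributions p(y | theta_i)
# over 3 classes, e.g. obtained from a deep ensemble or MC dropout.
preds = np.array([
    [0.70, 0.20, 0.10],
    [0.10, 0.80, 0.10],
    [0.30, 0.40, 0.30],
])

# Total uncertainty: entropy of the averaged predictive distribution H(Y).
total = entropy(preds.mean(axis=0))

# Aleatoric part: expected conditional entropy E_theta[H(Y | theta)].
aleatoric = entropy(preds, axis=-1).mean()

# Epistemic part: mutual information I(Y; theta) = H(Y) - E_theta[H(Y | theta)].
epistemic = total - aleatoric

print(f"total     = {total:.4f} nats")
print(f"aleatoric = {aleatoric:.4f} nats")
print(f"epistemic = {epistemic:.4f} nats")
```

By construction the three quantities satisfy the additive identity H(Y) = E_theta[H(Y | theta)] + I(Y; theta); it is the interpretation of these terms as aleatoric and epistemic uncertainty, and the additivity assumption itself, that the note calls into question.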