
Deterministic Neural Networks with Appropriate Inductive Biases Capture Epistemic and Aleatoric Uncertainty

02/23/2021
by   Jishnu Mukhoti, et al.

We show that a single softmax neural net with minimal changes can beat the uncertainty predictions of Deep Ensembles and other, more complex single-forward-pass uncertainty approaches. Softmax neural nets cannot capture epistemic uncertainty reliably because, for OoD points, they extrapolate arbitrarily and suffer from feature collapse. This results in arbitrary softmax entropies for OoD points, which can be high, low, or anything in between. We study why, and show that with the right inductive biases, softmax neural nets trained with maximum likelihood reliably capture epistemic uncertainty through their feature-space density. This density is obtained using Gaussian Discriminant Analysis, but on its own it cannot disentangle uncertainties. We show that it is necessary to combine this density with the softmax entropy to disentangle aleatoric and epistemic uncertainty – crucial, e.g., for active learning. We examine the quality of epistemic uncertainty on active learning and OoD detection, where we obtain SOTA 0.98 AUROC on CIFAR-10 vs. SVHN.
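The recipe described in the abstract (a feature-space density fitted with Gaussian Discriminant Analysis for epistemic uncertainty, plus softmax entropy for aleatoric uncertainty) can be sketched compactly. The Python/NumPy snippet below is only an illustration of that idea, not the authors' implementation (see the DDU repository below); the features, logits, and labels arrays are hypothetical stand-ins for the penultimate-layer features and outputs of a trained network with the appropriate inductive biases (e.g. residual connections with spectral normalisation).

# Minimal sketch (illustration only): GDA density in feature space for
# epistemic uncertainty, softmax entropy for aleatoric uncertainty.
# `features`, `logits`, `labels` are hypothetical stand-ins for the outputs
# of a trained network; this is not the authors' implementation.
import numpy as np
from scipy.special import logsumexp, softmax
from scipy.stats import multivariate_normal

def fit_gda(features, labels, jitter=1e-4):
    """Fit one Gaussian per class on feature vectors (Gaussian Discriminant Analysis)."""
    gaussians, priors = [], []
    for c in np.unique(labels):
        fc = features[labels == c]
        cov = np.cov(fc, rowvar=False) + jitter * np.eye(fc.shape[1])
        gaussians.append(multivariate_normal(mean=fc.mean(axis=0), cov=cov))
        priors.append(len(fc) / len(features))
    return gaussians, np.array(priors)

def feature_log_density(features, gaussians, priors):
    """log q(z) = log sum_c pi_c N(z | mu_c, Sigma_c); low density signals epistemic uncertainty."""
    per_class = np.stack([g.logpdf(features) + np.log(p)
                          for g, p in zip(gaussians, priors)], axis=-1)
    return logsumexp(per_class, axis=-1)

def softmax_entropy(logits):
    """Entropy of the softmax predictive distribution; high entropy signals aleatoric uncertainty."""
    p = softmax(logits, axis=-1)
    return -(p * np.log(p + 1e-12)).sum(axis=-1)

# Toy usage with random stand-ins for the features/logits of a trained net.
rng = np.random.default_rng(0)
features = rng.normal(size=(1000, 16))     # penultimate-layer features (training set)
labels = rng.integers(0, 10, size=1000)    # training labels
logits = rng.normal(size=(1000, 10))       # classifier logits

gaussians, priors = fit_gda(features, labels)
epistemic = -feature_log_density(features, gaussians, priors)  # higher = more epistemic
aleatoric = softmax_entropy(logits)                            # higher = more aleatoric

Disentangling the two uncertainties, as the abstract describes, then amounts to reading both scores per input: a point with low feature-space density is flagged as epistemically uncertain (e.g. OoD), while a high-density point with high softmax entropy is in-distribution but ambiguous (aleatoric), which is the signal active learning needs.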

Related Research

08/31/2019
Epistemic Uncertainty Sampling
Various strategies for active learning have been proposed in the machine...

02/02/2023
Normalizing Flow Ensembles for Rich Aleatoric and Epistemic Uncertainty Modeling
In this work, we demonstrate how to reliably estimate epistemic uncertai...

06/09/2021
Understanding Softmax Confidence and Uncertainty
It is often remarked that neural networks fail to increase their uncerta...

02/13/2023
Density-Softmax: Scalable and Distance-Aware Uncertainty Estimation under Distribution Shifts
Prevalent deep learning models suffer from significant over-confidence u...

03/04/2020
Simple and Scalable Epistemic Uncertainty Estimation Using a Single Deep Deterministic Neural Network
We propose a method for training a deterministic deep model that can fin...

08/06/2012
Credal nets under epistemic irrelevance
We present a new approach to credal nets, which are graphical models tha...

12/09/2020
Know Your Limits: Monotonicity & Softmax Make Neural Classifiers Overconfident on OOD Data
A crucial requirement for reliable deployment of deep learning models fo...

Code Repositories

DDU

Code for "Deterministic Neural Networks with Appropriate Inductive Biases Capture Epistemic and Aleatoric Uncertainty".


ddu_dirty_mnist

Dirty-MNIST dataset introduced in "Deterministic Neural Networks with Appropriate Inductive Biases Capture Epistemic and Aleatoric Uncertainty" (https://arxiv.org/abs/2102.11582).

