Revisiting One-vs-All Classifiers for Predictive Uncertainty and Out-of-Distribution Detection in Neural Networks

07/10/2020
by   Shreyas Padhy, et al.

Accurate estimation of predictive uncertainty in modern neural networks is critical for achieving well-calibrated predictions and detecting out-of-distribution (OOD) inputs. The most promising approaches have focused predominantly on improving model uncertainty (e.g. deep ensembles and Bayesian neural networks) and on post-processing techniques for OOD detection (e.g. ODIN and Mahalanobis distance). However, there has been relatively little investigation into how the parametrization of the probabilities in discriminative classifiers affects the uncertainty estimates, and the dominant method, softmax cross-entropy, results in misleadingly high confidences on OOD data and under covariate shift. We investigate alternative ways of formulating probabilities using (1) a one-vs-all formulation to capture the notion of "none of the above", and (2) a distance-based logit representation to encode uncertainty as a function of distance to the training manifold. We show that one-vs-all formulations can improve calibration on image classification tasks, while matching the predictive performance of softmax without incurring any additional training or test-time complexity.
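The two ideas in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; the function names and the toy centroid-distance logit are assumptions for illustration. It contrasts softmax (probabilities forced to sum to 1) with a one-vs-all sigmoid parametrization (each class scored independently, so all scores can be low), using logits that decay with distance from the training data:

```python
import numpy as np

def softmax_probs(logits):
    # Softmax normalizes across classes: probabilities always sum to 1,
    # so even a far-away OOD input gets high confidence for some class.
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

def one_vs_all_probs(logits):
    # One-vs-all: an independent sigmoid per class. All probabilities can
    # be low at once, expressing "none of the above".
    return 1.0 / (1.0 + np.exp(-logits))

def distance_logits(x, centroids):
    # Illustrative distance-based logit: logit_k = -||x - mu_k||, so
    # confidence decreases with distance to the training manifold.
    return -np.linalg.norm(x - centroids, axis=1)

# Two class centroids and a point far from both (an OOD-like input).
centroids = np.array([[0.0, 0.0], [4.0, 0.0]])
far_point = np.array([50.0, 50.0])
logits = distance_logits(far_point, centroids)

print(one_vs_all_probs(logits).max())       # near 0: low confidence everywhere
print(softmax_probs(logits).sum())          # 1.0: softmax must commit to a class
```

With distance-based logits, the one-vs-all head assigns near-zero probability to every class on the far-away point, while softmax still distributes a full unit of probability mass among the classes, which is the misleading-confidence behavior the abstract describes.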


Related research

02/19/2020 · Being Bayesian about Categorical Probability
Neural networks utilize the softmax as a building block in classificatio...

06/28/2022 · SLOVA: Uncertainty Estimation Using Single Label One-Vs-All Classifier
Deep neural networks present impressive performance, yet they cannot rel...

07/17/2022 · Uncertainty Calibration in Bayesian Neural Networks via Distance-Aware Priors
As we move away from the data, the predictive uncertainty should increas...

07/27/2021 · Energy-Based Open-World Uncertainty Modeling for Confidence Calibration
Confidence calibration is of great importance to the reliability of deci...

02/13/2023 · Density-Softmax: Scalable and Distance-Aware Uncertainty Estimation under Distribution Shifts
Prevalent deep learning models suffer from significant over-confidence u...

12/09/2020 · Know Your Limits: Monotonicity & Softmax Make Neural Classifiers Overconfident on OOD Data
A crucial requirement for reliable deployment of deep learning models fo...

10/18/2022 · Uncertainty estimation for out-of-distribution detection in computational histopathology
In computational histopathology, algorithms now outperform humans on a ra...
