Multi-Class Uncertainty Calibration via Mutual Information Maximization-based Binning

06/23/2020
by Kanil Patel, et al.

Post-hoc calibration is a common approach for providing high-quality confidence estimates of deep neural network predictions. Recent work has shown that widely used scaling methods underestimate their calibration error, while alternative Histogram Binning (HB) methods with verifiable calibration performance often fail to preserve classification accuracy. In the case of multi-class calibration with a large number of classes K, HB also faces severe sample inefficiency due to the large class imbalance that results from converting the task into K one-vs-rest class-wise calibration problems. The goal of this paper is to resolve the identified issues of HB in order to provide verified and calibrated confidence estimates, using only a small holdout calibration dataset for bin optimization, while preserving multi-class ranking accuracy. From an information-theoretic perspective, we derive the I-Max concept for binning, which maximizes the mutual information between labels and binned (quantized) logits. This concept mitigates potential loss in ranking performance due to lossy quantization and, by disentangling the optimization of bin edges and representatives, allows simultaneous improvement of ranking and calibration performance. In addition, we propose a shared class-wise (sCW) binning strategy that fits a single calibrator on the merged training sets of all K class-wise problems, yielding reliable estimates from a small calibration set. The combination of sCW and I-Max binning outperforms state-of-the-art calibration methods on various evaluation metrics across different benchmark datasets and models, even when using only a small set of calibration data, e.g., 1k samples for ImageNet.
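To make the binning idea concrete, the following is a minimal sketch of histogram-binning calibration with the shared class-wise (sCW) merging described above: the K one-vs-rest problems are pooled and a single set of bin edges and representatives is fit. For simplicity, equal-mass quantile edges stand in for the paper's mutual-information-maximizing (I-Max) edges; the function names and the `n_bins` parameter are illustrative, not from the paper.

```python
import numpy as np

def fit_shared_binning(logits, labels, n_bins=15):
    """Fit one histogram-binning calibrator on the merged one-vs-rest problems.

    Sketch of the shared class-wise (sCW) idea: instead of fitting K separate
    calibrators, pool the logits of all K one-vs-rest problems and fit a single
    set of bin edges and representatives. Equal-mass edges are a stand-in for
    the I-Max edges, which would be optimized to maximize mutual information
    between the binary labels and the binned logits.
    """
    n, k = logits.shape
    flat_logits = logits.ravel()  # merge all K class-wise calibration sets
    # One-vs-rest binary labels: 1 where the class is the true label.
    flat_labels = (labels[:, None] == np.arange(k)).ravel().astype(float)
    # Equal-mass interior bin edges over the pooled logits.
    edges = np.quantile(flat_logits, np.linspace(0, 1, n_bins + 1)[1:-1])
    bin_idx = np.digitize(flat_logits, edges)
    # Bin representatives: empirical frequency of the positive label per bin.
    reps = np.array([
        flat_labels[bin_idx == b].mean() if np.any(bin_idx == b) else 0.0
        for b in range(n_bins)
    ])
    return edges, reps

def calibrate(logits, edges, reps):
    """Map raw logits to calibrated class-wise confidences via the shared bins."""
    return reps[np.digitize(logits, edges)]
```

Because a single calibrator sees K times as many samples, each bin's representative is estimated from far more data than in the per-class setting, which is what makes a small holdout set (e.g., 1k samples for ImageNet) workable.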


Related research

03/15/2020
Intra Order-preserving Functions for Calibration of Multi-Class Neural Networks
Predicting calibrated confidence scores for multi-class deep networks is...

05/10/2021
Meta-Cal: Well-controlled Post-hoc Calibration by Ranking
In many applications, it is desirable that a classifier not only makes a...

06/19/2023
Scaling of Class-wise Training Losses for Post-hoc Calibration
The class-wise training losses often diverge as a result of the various ...

01/30/2020
Better Multi-class Probability Estimates for Small Data Sets
Many classification applications require accurate probability estimates ...

03/16/2020
Mix-n-Match: Ensemble and Compositional Methods for Uncertainty Calibration in Deep Learning
This paper studies the problem of post-hoc calibration of machine learni...

02/15/2022
Taking a Step Back with KCal: Multi-Class Kernel-Based Calibration for Deep Neural Networks
Deep neural network (DNN) classifiers are often overconfident, producing...

10/07/2022
Class-wise and reduced calibration methods
For many applications of probabilistic classifiers it is important that ...
