Sample-dependent Adaptive Temperature Scaling for Improved Calibration

07/13/2022
by Tom Joy, et al.

It is now well known that neural networks can be wrong with high confidence in their predictions, leading to poor calibration. The most common post-hoc approach to compensate for this is temperature scaling, which adjusts the confidences of the predictions on any input by scaling the logits by a fixed value. Whilst this approach typically improves average calibration across the whole test dataset, it does so by reducing the individual confidences of the predictions irrespective of whether the classification of a given input is correct or incorrect. With this insight, we base our method on the observation that different samples contribute to the calibration error by varying amounts, with some needing their confidence increased and others needing it decreased. Therefore, for each input, we propose to predict a different temperature value, allowing us to adjust the mismatch between confidence and accuracy at a finer granularity. Furthermore, we observe improved results on OOD detection and can also extract a notion of hardness for the data points. Our method is applied post-hoc to off-the-shelf pre-trained classifiers, requiring very little computation time and a negligible memory footprint. We test our method on the ResNet50 and WideResNet28-10 architectures using the CIFAR10/100 and Tiny-ImageNet datasets, showing that producing per-data-point temperatures also benefits the expected calibration error across the whole test set. Code is available at: https://github.com/thwjoy/adats.
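The contrast between a single global temperature and a sample-dependent one can be illustrated with a minimal sketch. This is not the authors' exact architecture; the AdaptiveTemperature module, its MLP shape, and the use of penultimate-layer features as input are assumptions made purely for illustration.

```python
import torch
import torch.nn as nn

def fixed_temperature_scaling(logits, T):
    # Standard post-hoc temperature scaling: one scalar T shared by all inputs.
    return torch.softmax(logits / T, dim=-1)

class AdaptiveTemperature(nn.Module):
    """Hypothetical per-sample temperature predictor (illustrative sketch only).

    Maps per-sample features (e.g. penultimate-layer activations of a frozen
    classifier) to a positive temperature T(x), then rescales that sample's logits.
    """
    def __init__(self, feature_dim, hidden_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feature_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
            nn.Softplus(),  # keep the predicted temperature positive
        )

    def forward(self, features, logits):
        T = self.net(features) + 1e-3              # shape (batch, 1), one T per sample
        return torch.softmax(logits / T, dim=-1)   # per-sample calibrated confidences
```

In both cases the classifier itself stays frozen; only the scalar T or the small predictor would be fit post-hoc on a held-out calibration set, for example by minimising the negative log-likelihood, which is what makes the approach cheap in both compute and memory.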

Related research

- Parameterized Temperature Scaling for Boosting the Expressive Power in Post-Hoc Uncertainty Calibration (02/24/2021)
- Revisiting Calibration for Question Answering (05/25/2022)
- Adaptive Temperature Scaling for Robust Calibration of Deep Neural Networks (07/31/2022)
- Temperature as Uncertainty in Contrastive Learning (10/08/2021)
- A Geometric Method for Improved Uncertainty Estimation in Real-time (06/23/2022)
- Adaptive Calibrator Ensemble for Model Calibration under Distribution Shift (03/09/2023)