Calibrating Deep Neural Networks using Focal Loss

02/21/2020
by Jishnu Mukhoti, et al.

Miscalibration – a mismatch between a model's confidence and its correctness – of Deep Neural Networks (DNNs) makes their predictions hard to rely on. Ideally, we want networks to be accurate, calibrated and confident. We show that, as opposed to the standard cross-entropy loss, focal loss (Lin et al., 2017) allows us to learn models that are already very well calibrated. When combined with temperature scaling, whilst preserving accuracy, it yields state-of-the-art calibrated models. We provide a thorough analysis of the factors causing miscalibration, and use the insights we glean from this to justify the empirically excellent performance of focal loss. To facilitate the use of focal loss in practice, we also provide a principled approach to automatically select the hyperparameter involved in the loss function. We perform extensive experiments on a variety of computer vision and NLP datasets, and with a wide variety of network architectures, and show that our approach achieves state-of-the-art accuracy and calibration in almost all cases.
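For reference, focal loss replaces the cross-entropy term -log(p_t) with -(1 - p_t)^γ log(p_t): the modulating factor (1 - p_t)^γ down-weights samples the model already classifies confidently, and γ = 0 recovers cross-entropy. Below is a minimal PyTorch sketch of the loss and of post-hoc temperature scaling; the function names and the fixed γ = 3 are illustrative, not the paper's implementation (see the focal_calibration repository below), which selects γ automatically.

```python
import torch.nn.functional as F


def focal_loss(logits, targets, gamma=3.0):
    """Multi-class focal loss: FL(p_t) = -(1 - p_t)^gamma * log(p_t).

    gamma = 0 recovers standard cross-entropy; larger gamma reduces
    the loss contribution of already well-classified examples.
    gamma = 3.0 here is an illustrative default, not the value the
    paper's automatic selection scheme would choose.
    """
    log_pt = F.log_softmax(logits, dim=-1)                      # (N, C)
    log_pt = log_pt.gather(1, targets.unsqueeze(1)).squeeze(1)  # (N,)
    pt = log_pt.exp()
    return (-((1.0 - pt) ** gamma) * log_pt).mean()


def temperature_scale(logits, temperature):
    """Post-hoc temperature scaling (Guo et al., 2017): divide logits
    by a scalar T > 0 fitted on a held-out set. It rescales confidence
    without changing the argmax prediction, so accuracy is preserved.
    """
    return logits / temperature
```

Training with `focal_loss` is a drop-in replacement for `F.cross_entropy`; temperature scaling is applied afterwards to the trained model's held-out logits.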

Related Research

10/27/2018

A New Loss Function for Temperature Scaling to have Better Calibrated Deep Networks

Deep neural networks have recently achieved impressive results f...
11/29/2022

NCTV: Neural Clamping Toolkit and Visualization for Neural Network Calibration

With the advancement of deep learning technology, neural networks have d...
12/20/2020

Towards Trustworthy Predictions from Deep Neural Networks with Fast Adversarial Calibration

To facilitate a wide-spread acceptance of AI systems guiding decision ma...
09/28/2018

Confidence Calibration in Deep Neural Networks through Stochastic Inferences

We propose a generic framework to calibrate accuracy and confidence (sco...
08/23/2019

Calibration of Deep Probabilistic Models with Decoupled Bayesian Neural Networks

Deep Neural Networks (DNNs) have achieved state-of-the-art accuracy perf...
11/21/2022

AdaFocal: Calibration-aware Adaptive Focal Loss

Much recent work has been devoted to the problem of ensuring that a neur...
06/29/2022

RegMixup: Mixup as a Regularizer Can Surprisingly Improve Accuracy and Out-of-Distribution Robustness

We show that the effectiveness of the well-celebrated Mixup [Zhang et al...

Code Repositories

focal_calibration

Code for the paper "Calibrating Deep Neural Networks using Focal Loss"



meta-calibration

Official PyTorch implementation of "Meta-Calibration: Meta-Learning of Model Calibration Using Differentiable Expected Calibration Error"


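For context, the Expected Calibration Error (ECE) mentioned above is the standard calibration metric: predictions are grouped into confidence bins, and the absolute gap between each bin's accuracy and its mean confidence is averaged, weighted by bin size. A minimal sketch of the usual (non-differentiable) binned estimator follows, assuming PyTorch tensors; the function name and the 15-bin default are common conventions, not taken from either repository.

```python
import torch


def expected_calibration_error(confidences, correct, n_bins=15):
    """Binned ECE: sum over bins b of (|B_b| / N) * |acc(B_b) - conf(B_b)|.

    confidences: (N,) tensor of max softmax probabilities
    correct:     (N,) boolean tensor, True where the top-1 prediction
                 matches the label
    """
    ece = torch.zeros(())
    edges = torch.linspace(0.0, 1.0, n_bins + 1)
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        weight = in_bin.float().mean()  # |B_b| / N
        if weight > 0:
            acc = correct[in_bin].float().mean()
            conf = confidences[in_bin].mean()
            ece = ece + weight * (acc - conf).abs()
    return ece.item()
```

A perfectly calibrated model has ECE 0; the meta-calibration repository above refers to a differentiable variant of this quantity that can be optimized during training.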
