Controlled abstention neural networks for identifying skillful predictions for regression problems

04/16/2021
by Elizabeth A. Barnes, et al.

The earth system is exceedingly complex and often chaotic in nature, making prediction incredibly challenging: we cannot expect to make perfect predictions all of the time. Instead, we look for specific states of the system that lead to more predictable behavior than others, often termed "forecasts of opportunity". When these opportunities are not present, scientists need prediction systems that are capable of saying "I don't know." We introduce a novel loss function, termed the "abstention loss", that allows neural networks to identify forecasts of opportunity for regression problems. The abstention loss works by incorporating uncertainty in the network's prediction to identify the more confident samples and abstain (say "I don't know") on the less confident samples. The abstention loss is designed either to determine the optimal abstention fraction on its own or to abstain on a user-defined fraction via a PID controller. Unlike many methods that attach uncertainty to neural network predictions after training, the abstention loss is applied during training, so the network preferentially learns from the more confident samples. The abstention loss builds upon a standard computer science method; while that standard approach is itself a simple yet powerful tool for incorporating uncertainty into regression problems, we demonstrate that the abstention loss outperforms it for the synthetic climate use cases explored here. Implementing the proposed loss function is straightforward in most network architectures designed for regression, as it only requires modification of the output layer and the loss function.
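For readers who want a concrete picture, below is a minimal PyTorch sketch of the idea as described in the abstract: a network whose output layer predicts a mean, a log-variance (heteroscedastic uncertainty, plausibly the "standard method" the abstract alludes to), and an abstention logit, trained with a confidence-weighted Gaussian negative log-likelihood whose abstention penalty is tuned by a PID controller. Every name here (AbstentionRegressionLoss, target_fraction, kp/ki/kd) is illustrative and not taken from the paper; this is an assumption-laden sketch, not the authors' implementation.

```python
import torch
import torch.nn as nn

class AbstentionRegressionLoss(nn.Module):
    """Illustrative abstention-style loss for regression (not the paper's code).

    Assumes the network's output layer emits three values per sample:
    a predicted mean `mu`, a log-variance `log_var` (heteroscedastic
    uncertainty), and a raw `abstain_logit`. Low-confidence samples are
    down-weighted in the Gaussian negative log-likelihood, and a simple
    PID controller adjusts the abstention penalty `alpha` so the network
    abstains on roughly `target_fraction` of the samples.
    """

    def __init__(self, target_fraction=0.5, kp=1.0, ki=0.1, kd=0.0):
        super().__init__()
        self.target = target_fraction
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0
        self.alpha = 1.0  # abstention penalty, retuned by the controller

    def update_alpha(self, abstained_fraction):
        # Discrete PID update on the gap between the observed and the
        # requested abstention fractions: abstaining too often raises
        # alpha, which makes abstention more costly, and vice versa.
        error = abstained_fraction - self.target
        self.integral += error
        derivative = error - self.prev_error
        self.prev_error = error
        self.alpha = max(
            0.0,
            self.kp * error + self.ki * self.integral + self.kd * derivative,
        )

    def forward(self, mu, log_var, abstain_logit, y):
        # Per-sample Gaussian negative log-likelihood (heteroscedastic).
        nll = 0.5 * (log_var + (y - mu) ** 2 / torch.exp(log_var))
        # w near 0 means "abstain" on that sample; w near 1 means "predict".
        w = torch.sigmoid(abstain_logit)
        # Weight each sample's NLL by its confidence, and charge a penalty
        # for abstaining so the network cannot abstain on everything.
        loss = (w * nll - self.alpha * torch.log(w + 1e-8)).mean()
        self.update_alpha((w < 0.5).float().mean().item())
        return loss
```

In a real training loop the controller would more likely be updated once per epoch from a validation abstention rate; the per-batch update above just keeps the sketch self-contained. The point the abstract makes carries through: only the output layer (three heads instead of one) and the loss function change, so the scheme drops into most regression architectures.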

