DBCal: Density Based Calibration of classifier predictions for uncertainty quantification

04/01/2022
by Alex Hagen et al.

Measurement of the uncertainty of predictions from machine learning methods is important across scientific domains and applications. We present, to our knowledge, the first technique that quantifies the uncertainty of predictions from a classifier while accounting for both the classifier's belief and its performance. We prove that our method provides an accurate estimate of the probability that the outputs of two neural networks are correct, showing an expected calibration error of less than 0.2% for one network and less than 3% for the other. We empirically show that the uncertainty returned by our method is an accurate measurement of the probability that the classifier's prediction is correct and therefore has broad utility in uncertainty propagation.
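As context for the calibration figures above, the sketch below shows one standard way to compute the expected calibration error (ECE) that the abstract reports: predictions are grouped into confidence bins, and the occupancy-weighted gap between each bin's accuracy and its mean confidence is summed. This is a generic ECE computation, not the authors' DBCal implementation; the function name, bin count, and usage variables are illustrative assumptions.

```python
import numpy as np

def expected_calibration_error(confidence, is_correct, n_bins=15):
    """Binned ECE: occupancy-weighted |accuracy - mean confidence| over confidence bins."""
    confidence = np.asarray(confidence, dtype=float)
    is_correct = np.asarray(is_correct, dtype=float)
    # Assign each prediction to one of n_bins equal-width bins on [0, 1].
    bin_ids = np.clip((confidence * n_bins).astype(int), 0, n_bins - 1)
    ece = 0.0
    for b in range(n_bins):
        in_bin = bin_ids == b
        if in_bin.any():
            gap = abs(is_correct[in_bin].mean() - confidence[in_bin].mean())
            ece += in_bin.mean() * gap  # weight the gap by the fraction of samples in this bin
    return ece

# Hypothetical usage: `probs` are calibrated confidences for the predicted class,
# `preds` and `labels` are the predicted and true class indices.
# ece = expected_calibration_error(probs, preds == labels)
```

A lower ECE means the reported confidences track the observed accuracy more closely, which is the sense in which the uncertainty estimate can be propagated as a probability.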


Related research

NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks (02/07/2022). This paper proposes a fast and scalable method for uncertainty quantific...

Performance Measurement for Deep Bayesian Neural Network (03/20/2019). Deep Bayesian neural network has aroused a great attention in recent yea...

Accounting for Vibration Noise in Stochastic Measurement Errors (03/31/2023). The measurement of data over time and/or space is of utmost importance i...

Unveil the unseen: Exploit information hidden in noise (09/17/2022). Noise and uncertainty are usually the enemy of machine learning, noise i...

Theoretical characterization of uncertainty in high-dimensional linear classification (02/07/2022). Being able to reliably assess not only the accuracy but also the uncerta...

Don't guess what's true: choose what's optimal. A probability transducer for machine-learning classifiers (02/21/2023). In fields such as medicine and drug discovery, the ultimate goal of a cl...
