Improved Predictive Uncertainty using Corruption-based Calibration

06/07/2021
by Tiago Salvador, et al.

We propose a simple post hoc calibration method to estimate the confidence/uncertainty that a model prediction is correct on data with covariate shift, as represented by the large-scale corrupted data benchmark [Ovadia et al., 2019]. We achieve this by synthesizing surrogate calibration sets, obtained by corrupting the calibration set with varying intensities of a known corruption. Our method demonstrates significant improvements on the benchmark across a wide range of covariate shifts.
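To make the idea concrete, below is a minimal sketch of corruption-based surrogate calibration sets. It assumes additive Gaussian noise as the known corruption and temperature scaling as the post hoc calibrator, and the names `corrupt`, `fit_temperature`, `calibrate_per_severity`, and `model_logits_fn` are illustrative placeholders rather than the paper's actual implementation.

```python
# Hypothetical sketch: corrupt a clean calibration set at several severities to
# synthesize surrogate calibration sets, then fit one post hoc calibration
# parameter (here: a temperature) per severity. The choice of Gaussian noise and
# temperature scaling is an illustrative assumption, not the paper's exact recipe.
import numpy as np
from scipy.optimize import minimize_scalar


def corrupt(x, severity):
    """Apply a known corruption (additive Gaussian noise) at the given severity."""
    return x + np.random.normal(scale=severity, size=x.shape)


def fit_temperature(logits, labels):
    """Fit a single temperature T by minimizing the negative log-likelihood."""
    def nll(T):
        z = logits / T
        z = z - z.max(axis=1, keepdims=True)  # numerical stability
        log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
        return -log_probs[np.arange(len(labels)), labels].mean()

    return minimize_scalar(nll, bounds=(0.05, 20.0), method="bounded").x


def calibrate_per_severity(model_logits_fn, x_cal, y_cal, severities):
    """Return one temperature per corruption severity, each fit on a surrogate set."""
    temperatures = {}
    for s in severities:
        x_surrogate = corrupt(x_cal, s)        # synthesize surrogate calibration set
        logits = model_logits_fn(x_surrogate)  # model predictions on corrupted inputs
        temperatures[s] = fit_temperature(logits, y_cal)  # labels unchanged (covariate shift)
    return temperatures
```

At test time, one would pick (or interpolate between) the temperatures fit on the surrogate sets whose corruption intensity best matches the observed shift; that selection step is outside the scope of this sketch.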
