Predicting with Confidence on Unseen Distributions

07/07/2021
by Devin Guillory et al.

Recent work has shown that the performance of machine learning models can vary substantially when models are evaluated on data drawn from a distribution that is close to, but different from, the training distribution. As a result, predicting model performance on unseen distributions is an important challenge. Our work connects techniques from the domain adaptation and predictive uncertainty literatures to predict model accuracy on challenging unseen distributions without access to labeled data. In the distribution shift setting, distributional distances are often used to adapt models and improve their performance on new domains; however, accuracy estimation, and predictive uncertainty more broadly, is often neglected in these investigations. Investigating a wide range of established distributional distances, such as the Fréchet distance and Maximum Mean Discrepancy (MMD), we find that they fail to yield reliable estimates of performance under distribution shift. In contrast, the difference of confidences (DoC) of a classifier's predictions successfully estimates the classifier's performance change over a variety of shifts. We specifically investigate the distinction between synthetic and natural distribution shifts, and observe that despite its simplicity, DoC consistently outperforms other quantifications of distributional difference. DoC reduces predictive error by almost half (46%) on several realistic and challenging distribution shifts, e.g., on the ImageNet-Vid-Robust and ImageNet-Rendition datasets.
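To make the DoC idea concrete, the following is a minimal sketch in Python/NumPy, not the authors' code: the function names and the synthetic softmax arrays are illustrative assumptions. It computes DoC as the gap between average max-softmax confidence on held-out source data and on unlabeled target data, and predicts that target accuracy drops by the same amount.

```python
import numpy as np

def avg_confidence(probs: np.ndarray) -> float:
    """Mean max-softmax confidence over a batch; probs has shape (n, n_classes)."""
    return float(probs.max(axis=1).mean())

def doc_accuracy_estimate(source_probs: np.ndarray,
                          source_labels: np.ndarray,
                          target_probs: np.ndarray) -> float:
    """Estimate target-domain accuracy via the difference of confidences (DoC).

    DoC = avg. confidence on held-out source data - avg. confidence on target data.
    The predicted accuracy change is assumed to track the confidence change:
        acc_target ~ acc_source - DoC
    """
    acc_source = float((source_probs.argmax(axis=1) == source_labels).mean())
    doc = avg_confidence(source_probs) - avg_confidence(target_probs)
    return acc_source - doc

# Illustrative usage with random softmax outputs (placeholders, not real model data).
rng = np.random.default_rng(0)
src = rng.dirichlet(np.ones(10) * 0.3, size=1000)  # peaked -> confident "source" outputs
tgt = rng.dirichlet(np.ones(10) * 0.8, size=1000)  # flatter -> less confident "target" outputs
labels = src.argmax(axis=1)                        # toy labels that match source predictions
print(f"estimated target accuracy: {doc_accuracy_estimate(src, labels, tgt):.3f}")
```

The bare subtraction above is the simplest instantiation of the idea; the paper also evaluates learned variants that fit the relationship between DoC and the observed accuracy change across a set of known shifts.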


