Beyond Invariance: Test-Time Label-Shift Adaptation for Distributions with "Spurious" Correlations

by Qingyao Sun, et al.

Spurious correlations, or correlations that change across domains where a model can be deployed, present significant challenges to real-world applications of machine learning models. However, such correlations are not always "spurious"; often, they provide valuable prior information for a prediction beyond what can be extracted from the input alone. Here, we present a test-time adaptation method that exploits the spurious correlation phenomenon, in contrast to recent approaches that attempt to eliminate spurious correlations through invariance. We consider situations where the prior distribution p(y, z), which models the marginal dependence between the class label y and the nuisance factors z, may change across domains, but the generative model for features p(𝐱|y, z) is constant. We note that this is an expanded version of the label shift assumption, where the labels now also include the nuisance factors z. Based on this observation, we train a classifier to predict p(y, z|𝐱) on the source distribution, and implement a test-time label shift correction that adapts to changes in the marginal distribution p(y, z) using unlabeled samples from the target domain. We call our method "Test-Time Label-Shift Adaptation" or TTLSA. We apply our method to two different image datasets – the CheXpert chest X-ray dataset and the colored MNIST dataset – and show that it gives better downstream results than methods that try to train classifiers which are invariant to the changes in prior distribution. Code reproducing experiments is available at .
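The test-time correction described above can be sketched as an EM-style prior adaptation over the joint label (y, z): starting from the source classifier's posteriors and source marginal, iterate between reweighting posteriors by the ratio of estimated target to source priors and re-estimating the target prior as the average adapted posterior. This is a minimal illustrative sketch, not the paper's reference implementation; the function name `em_label_shift` and its interface are assumptions for illustration.

```python
import numpy as np

def em_label_shift(probs_src, prior_src, n_iter=100, tol=1e-8):
    """EM prior adaptation over the joint label (y, z).

    probs_src: (n, k) source-classifier posteriors p_s(y, z | x) on
               unlabeled target samples, where the k classes enumerate
               all (y, z) pairs.
    prior_src: (k,) source marginal p_s(y, z).
    Returns (estimated target prior, adapted posteriors).
    """
    prior_t = prior_src.copy()
    for _ in range(n_iter):
        # E-step: reweight each posterior by the ratio of the current
        # target-prior estimate to the source prior, then renormalize.
        w = probs_src * (prior_t / prior_src)
        w /= w.sum(axis=1, keepdims=True)
        # M-step: the new target prior is the average adapted posterior.
        new_prior = w.mean(axis=0)
        if np.abs(new_prior - prior_t).max() < tol:
            prior_t = new_prior
            break
        prior_t = new_prior
    return prior_t, w
```

Downstream, a prediction for y alone can be obtained by marginalizing the adapted joint posterior over the nuisance factor z.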




