Understanding new tasks through the lens of training data via exponential tilting

05/26/2022
by   Subha Maity, et al.

Deploying machine learning models to new tasks is a major challenge despite the large size of modern training datasets. However, it is conceivable that the training data can be reweighted to be more representative of the new (target) task. We consider the problem of reweighting the training samples to gain insights into the distribution of the target task. Specifically, we formulate a distribution shift model based on the exponential tilt assumption and learn importance weights for the training data by minimizing the KL divergence between the labeled training and unlabeled target datasets. The learned training data weights can then be used for downstream tasks such as target performance evaluation, fine-tuning, and model selection. We demonstrate the efficacy of our method on the Waterbirds and Breeds benchmarks.
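To make the weighting idea concrete, here is a minimal sketch (not the paper's exact procedure) assuming source and target examples have already been summarized by feature vectors `phi_src` and `phi_tgt` (hypothetical names). It fits a label-agnostic exponential tilt w(x) = exp(theta^T phi(x) + b) with a KLIEP-style objective, which corresponds to minimizing the KL divergence between the target distribution and the tilted source distribution up to constants; the paper's model additionally lets the tilt parameters depend on the class label.

```python
import numpy as np

def fit_exponential_tilt(phi_src, phi_tgt, lr=0.1, n_steps=500):
    """Fit a label-agnostic exponential tilt w(x) = exp(theta^T phi(x) + b).

    Maximizes the concave empirical objective
        mean_target[ theta^T phi ] - log mean_source[ exp(theta^T phi) ],
    i.e. a KLIEP-style fit that matches the tilted source to the target.
    phi_src: (n, d) source features; phi_tgt: (m, d) target features.
    Returns per-source-sample importance weights normalized to mean 1.
    """
    d = phi_src.shape[1]
    theta = np.zeros(d)
    tgt_mean = phi_tgt.mean(axis=0)
    for _ in range(n_steps):
        logits = phi_src @ theta                 # theta^T phi(x) on source points
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()                     # softmax over source samples
        grad = tgt_mean - probs @ phi_src        # gradient of the concave objective
        theta += lr * grad                       # gradient ascent step
    logits = phi_src @ theta
    w = np.exp(logits - logits.max())
    return w / w.mean()                          # self-normalized weights

# Hypothetical downstream use: importance-weighted accuracy on held-out
# labeled source data as a plug-in estimate of target performance.
# w = fit_exponential_tilt(phi_src, phi_tgt)
# est_target_acc = np.average(preds_src == y_src, weights=w)
```

The same weights could, in principle, also drive weighted fine-tuning or model selection on the source data, which is the kind of downstream use the abstract describes.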

research
07/19/2021

Adaptive Transfer Learning on Graph Neural Networks

Graph neural networks (GNNs) are widely used to learn a powerful represen...
research
02/28/2017

Borrowing Treasures from the Wealthy: Deep Transfer Learning through Selective Joint Fine-tuning

Deep neural networks require a large amount of labeled training data dur...
research
11/03/2020

Meta-learning Transferable Representations with a Single Target Domain

Recent works found that fine-tuning and joint training—two popular appro...
research
12/23/2022

Principled and Efficient Transfer Learning of Deep Models via Neural Collapse

With the ever-growing model size and the limited availability of labeled...
research
11/29/2022

Building Resilience to Out-of-Distribution Visual Data via Input Optimization and Model Finetuning

A major challenge in machine learning is resilience to out-of-distributi...
research
02/06/2023

Data Selection for Language Models via Importance Resampling

Selecting a suitable training dataset is crucial for both general-domain...
research
06/12/2020

Learning Diverse Representations for Fast Adaptation to Distribution Shift

The i.i.d. assumption is a useful idealization that underpins many succe...
