Example-based Hypernetworks for Out-of-Distribution Generalization

03/27/2022
by Tomer Volk et al.

While Natural Language Processing (NLP) algorithms keep reaching unprecedented milestones, out-of-distribution generalization is still challenging. In this paper, we address the problem of multi-source adaptation to unknown domains: given labeled data from multiple source domains, we aim to generalize to data drawn from target domains that are unknown to the algorithm at training time. We present an algorithmic framework based on example-based Hypernetwork adaptation: given an input example, a T5 encoder-decoder first generates a unique signature that embeds this example in the semantic space of the source domains, and this signature is then fed into a Hypernetwork which generates the weights of the task classifier. In an advanced version of our model, the learned signature also serves to improve the representation of the input example. In experiments with two tasks, sentiment classification and natural language inference, across 29 adaptation settings, our algorithms substantially outperform existing algorithms for this adaptation setup. To the best of our knowledge, this is the first time Hypernetworks are applied to domain adaptation or used in an example-based manner in NLP.
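The core mechanism can be illustrated with a minimal, self-contained PyTorch sketch (not the authors' code): a stand-in signature encoder produces a per-example signature vector, and a small hypernetwork maps that signature to the weights and bias of a linear task classifier, which is then applied to the same example's representation. The dimensions, module names, and the use of a simple feed-forward encoder in place of the paper's T5 encoder-decoder are illustrative assumptions.

# Minimal sketch of an example-based hypernetwork (assumed dimensions and modules).
import torch
import torch.nn as nn


class ExampleBasedHypernet(nn.Module):
    def __init__(self, enc_dim=768, sig_dim=128, hidden_dim=256, num_classes=2):
        super().__init__()
        self.enc_dim = enc_dim
        self.num_classes = num_classes
        # Stand-in for the T5-based signature generator described in the abstract.
        self.signature_encoder = nn.Sequential(nn.Linear(enc_dim, sig_dim), nn.ReLU())
        # Hypernetwork: maps a signature to the parameters of a per-example linear classifier.
        self.hyper = nn.Sequential(
            nn.Linear(sig_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, enc_dim * num_classes + num_classes),
        )

    def forward(self, example_encoding):
        # example_encoding: (batch, enc_dim) representation of the input example.
        signature = self.signature_encoder(example_encoding)       # (batch, sig_dim)
        params = self.hyper(signature)                             # (batch, enc_dim*C + C)
        w = params[:, : self.enc_dim * self.num_classes]
        b = params[:, self.enc_dim * self.num_classes:]
        w = w.view(-1, self.num_classes, self.enc_dim)             # per-example weight matrices
        # Apply the generated classifier to the example's own encoding.
        logits = torch.bmm(w, example_encoding.unsqueeze(-1)).squeeze(-1) + b
        return logits


if __name__ == "__main__":
    model = ExampleBasedHypernet()
    x = torch.randn(4, 768)   # a batch of 4 example encodings
    print(model(x).shape)     # torch.Size([4, 2])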


Related research

02/24/2021 · PADA: A Prompt-based Autoregressive Approach for Adaptation to Unseen Domains
09/02/2022 · Domain Adaptation from Scratch
02/24/2022 · DoCoGen: Domain Counterfactual Generation for Low Resource Domain Adaptation
05/31/2020 · Neural Unsupervised Domain Adaptation in NLP—A Survey
10/02/2019 · Tree-Structured Semantic Encoder with Knowledge Sharing for Domain Adaptation in Natural Language Generation
10/05/2016 · Neural Structural Correspondence Learning for Domain Adaptation
11/11/2022 · Rethinking Data-driven Networking with Foundation Models: Challenges and Opportunities
