Extreme Multi-Domain, Multi-Task Learning With Unified Text-to-Text Transfer Transformers

09/21/2022
by Adebayo Oshingbesan, et al.

Text-to-text transformers have shown remarkable success in multi-task transfer learning, especially in natural language processing (NLP). However, while there have been several attempts to train transformers on different domains, these domains usually share a clear relationship, e.g., code summarization, where the natural language summary describes the code. Very few attempts have studied how multi-task transfer learning works on tasks from significantly different domains. In this project, we investigated the behavior of multi-domain, multi-task learning using multi-domain text-to-text transfer transformers (MD-T5) on four tasks across two domains: Python Code and Chess. We carried out extensive experiments using three popular training strategies: BERT-style joint pretraining + successive finetuning, GPT-style joint pretraining + successive finetuning, and GPT-style joint pretraining + joint finetuning. We also evaluated the models on four metrics: Play Score, Eval Score, BLEU Score, and Multi-Domain Learning Score (MDLS), which measure performance on the individual tasks as well as multi-domain learning. We show that while negative knowledge transfer and catastrophic forgetting remain considerable challenges for all the models, the GPT-style joint pretraining + joint finetuning strategy showed the most promise in multi-domain, multi-task learning, as it performs well across all four tasks while retaining its multi-domain knowledge.
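To make the text-to-text, multi-domain framing concrete, the sketch below shows how a single T5-style model can be finetuned jointly on examples from two unrelated domains by prepending a task prefix to each input. The prefixes, checkpoint name, and toy examples are illustrative assumptions for exposition, not the exact MD-T5 setup from the paper.

```python
# Minimal sketch of joint multi-domain text-to-text training in the T5 style.
# Task prefixes, checkpoint, and data below are hypothetical, not the paper's setup.
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Hypothetical examples from two domains, each cast as text-to-text with a
# task prefix so one model can be trained on all tasks jointly.
examples = [
    ("summarize python: def add(a, b): return a + b",
     "Return the sum of two numbers."),
    ("play chess: 1. e4 e5 2. Nf3",
     "Nc6"),
]

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
model.train()
for source, target in examples:
    inputs = tokenizer(source, return_tensors="pt")
    labels = tokenizer(target, return_tensors="pt").input_ids
    loss = model(**inputs, labels=labels).loss  # standard seq2seq cross-entropy
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

In practice the domain examples would be interleaved from much larger datasets, but the key design choice is the same: every task, whether code- or chess-related, is expressed as a text-to-text pair distinguished only by its prefix.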
