Recipes for Adapting Pre-trained Monolingual and Multilingual Models to Machine Translation

04/30/2020
by Asa Cooper Stickland, et al.

There has been recent success in pre-training on monolingual data and fine-tuning on Machine Translation (MT), but it remains unclear how best to leverage a pre-trained model for a given MT task. This paper investigates the benefits and drawbacks of freezing parameters, and of adding new ones, when fine-tuning a pre-trained model on MT. We focus on (1) fine-tuning BART, a model pre-trained only on English monolingual data, and (2) fine-tuning mBART, a model pre-trained on monolingual data from 25 languages. For BART we get the best performance by freezing most of the model parameters and adding extra positional embeddings. For mBART we match the performance of naive fine-tuning for most language pairs, and outperform it for Nepali to English (0.5 BLEU) and Czech to English (0.6 BLEU), all with a lower memory cost at training time. When constraining ourselves to an out-of-domain training set for Vietnamese to English, we outperform the fine-tuning baseline by 0.9 BLEU.
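The core recipe for BART described above, freezing most of the pre-trained parameters while adding a small number of new trainable ones such as extra positional embeddings, can be sketched in PyTorch. This is a minimal illustration with a hypothetical toy encoder-decoder standing in for BART; the module names (`new_pos`, `freeze_for_mt`) are assumptions for illustration, not the authors' actual code.

```python
import torch.nn as nn

class TinySeq2Seq(nn.Module):
    """Toy stand-in for a pre-trained encoder-decoder like BART (hypothetical)."""
    def __init__(self, vocab=100, dim=16, max_len=32):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)     # pre-trained (to be frozen)
        self.encoder = nn.Linear(dim, dim)        # pre-trained (to be frozen)
        self.decoder = nn.Linear(dim, vocab)      # pre-trained (to be frozen)
        # Newly added positional embeddings for the MT task: trainable.
        self.new_pos = nn.Embedding(max_len, dim)

def freeze_for_mt(model: nn.Module) -> nn.Module:
    """Freeze all pre-trained weights; train only the newly added parameters.

    Only parameters whose names mark them as new (here, the ``new_pos``
    embeddings) keep requires_grad=True, so the optimizer updates them alone
    and gradients for the frozen weights need not be stored.
    """
    for name, param in model.named_parameters():
        param.requires_grad = name.startswith("new_pos")
    return model

model = freeze_for_mt(TinySeq2Seq())
trainable = [n for n, p in model.named_parameters() if p.requires_grad]
```

Because frozen parameters need no optimizer state or gradient buffers, this is also where the paper's lower training-time memory cost comes from.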


