MultiFiT: Efficient Multi-lingual Language Model Fine-tuning

09/10/2019
by Julian Eisenschlos, et al.

Pretrained language models are promising particularly for low-resource languages as they only require unlabelled data. However, training existing models requires huge amounts of compute, while pretrained cross-lingual models often underperform on low-resource languages. We propose Multi-lingual language model Fine-Tuning (MultiFiT) to enable practitioners to train and fine-tune language models efficiently in their own language. In addition, we propose a zero-shot method using an existing pretrained cross-lingual model. We evaluate our methods on two widely used cross-lingual classification datasets where they outperform models pretrained on orders of magnitude more data and compute. We release all models and code.
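
MultiFiT builds on the ULMFiT recipe: pretrain a language model on target-language Wikipedia, fine-tune it on the task corpus, then fine-tune a classifier with gradual unfreezing. The sketch below shows that three-stage flow in fastai v1. It is a minimal illustration, not the released implementation: MultiFiT itself uses a 4-layer QRNN with subword (SentencePiece) tokenization, whereas `AWD_LSTM` is fastai's stock architecture, and the file and column names here are placeholders.

```python
from fastai.text import *

path = Path('data/reviews')  # hypothetical folder with train.csv holding text/label columns

# Stage 2: adapt the pretrained language model to the task corpus.
# (Stage 1, pretraining on target-language Wikipedia, is assumed done;
# fastai's default here loads English WikiText-103 weights as a stand-in.)
data_lm = TextLMDataBunch.from_csv(path, 'train.csv', text_cols='text')
lm = language_model_learner(data_lm, AWD_LSTM, drop_mult=0.3)
lm.fit_one_cycle(1, 1e-2)
lm.save_encoder('ft_enc')  # keep the adapted encoder for the classifier

# Stage 3: fine-tune a classifier on top of the adapted encoder,
# unfreezing gradually with discriminative learning rates.
data_clas = TextClasDataBunch.from_csv(path, 'train.csv', text_cols='text',
                                       label_cols='label', vocab=data_lm.vocab)
clf = text_classifier_learner(data_clas, AWD_LSTM, drop_mult=0.5)
clf.load_encoder('ft_enc')
clf.fit_one_cycle(1, 2e-2)                            # train the head only
clf.freeze_to(-2)                                     # unfreeze one more layer group
clf.fit_one_cycle(1, slice(1e-2 / (2.6 ** 4), 1e-2))
clf.unfreeze()
clf.fit_one_cycle(2, slice(1e-3 / (2.6 ** 4), 1e-3))  # fine-tune all layers
```

For the zero-shot setting, the paper uses a pretrained cross-lingual model (LASER) as a teacher: a classifier trained on source-language labels in LASER's shared embedding space pseudo-labels the target-language documents, and the monolingual model is then fine-tuned on those pseudo-labels. The sketch below assumes the third-party `laserembeddings` package (models must be downloaded first via `python -m laserembeddings download-models`) and scikit-learn; the toy data is purely illustrative.

```python
from laserembeddings import Laser
from sklearn.linear_model import LogisticRegression

en_texts = ["great movie", "terrible plot"]               # labelled source data (toy)
en_labels = [1, 0]
de_texts = ["großartiger Film", "schreckliche Handlung"]  # unlabelled target data (toy)

laser = Laser()
X_en = laser.embed_sentences(en_texts, lang='en')  # embeddings in the shared space
X_de = laser.embed_sentences(de_texts, lang='de')

teacher = LogisticRegression(max_iter=1000).fit(X_en, en_labels)
pseudo_labels = teacher.predict(X_de)              # zero-shot target-language labels

# Feeding pseudo_labels into the classifier stage above fine-tunes the
# monolingual model without any target-language annotations.
```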

