Marvelous Agglutinative Language Effect on Cross Lingual Transfer Learning

04/08/2022
by Wooyoung Kim et al.

For multilingual language models, it is important to select the languages used for training because of the curse of multilinguality (Conneau et al., 2020). It is known that training on languages with similar linguistic structures is effective for cross-lingual transfer learning (Pires et al., 2019). However, we demonstrate that using agglutinative languages such as Korean is even more effective for cross-lingual transfer learning. This finding could change the language-selection strategy for training cross-lingual transfer models.
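The abstract does not specify an implementation, but the setup it studies, zero-shot cross-lingual transfer with a multilingual model, typically looks like the minimal sketch below. The choice of xlm-roberta-base as the backbone, the two-class classification task, and the example sentence are illustrative assumptions, not details taken from the paper.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed multilingual backbone; the paper's exact model is not given here.
model_name = "xlm-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Step 1 (omitted): fine-tune the model on labeled data in a source language.
# The paper's claim is that an agglutinative source language such as Korean
# transfers better than a source language chosen for structural similarity.

# Step 2: evaluate zero-shot on a target language, with no target-language
# labels. Here the classification head is untrained, so the prediction is
# meaningless until Step 1 has been run; the code only illustrates the flow.
inputs = tokenizer("This movie was wonderful.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()  # predicted class index

Because the backbone shares one vocabulary and parameter set across languages, the fine-tuned head in Step 1 can be applied unchanged to target-language inputs in Step 2, which is what makes the choice of source language matter.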
