AfroLM: A Self-Active Learning-based Multilingual Pretrained Language Model for 23 African Languages

11/07/2022
by Bonaventure F. P. Dossou, et al.

In recent years, multilingual pre-trained language models have gained prominence due to their remarkable performance on numerous downstream Natural Language Processing (NLP) tasks. However, pre-training these large multilingual language models requires large amounts of training data, which are not available for African languages. Active learning is a semi-supervised learning algorithm in which a model dynamically learns to identify the samples most beneficial to train on, in order to achieve better optimization and performance on downstream tasks. Furthermore, active learning effectively and practically addresses real-world data scarcity. Despite these benefits, active learning has received little consideration in NLP, and especially in multilingual language model pretraining. In this paper, we present AfroLM, a multilingual language model pretrained from scratch on 23 African languages (the largest effort to date) using our novel self-active learning framework. Pretrained on a dataset significantly (14x) smaller than those of existing baselines, AfroLM outperforms many multilingual pretrained language models (AfriBERTa, XLMR-base, mBERT) on various downstream NLP tasks (NER, text classification, and sentiment analysis). Additional out-of-domain sentiment analysis experiments show that AfroLM generalizes well across various domains. We release our source code and the datasets used in our framework at https://github.com/bonaventuredossou/MLM_AL.
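To make the self-active learning loop described in the abstract concrete, below is a minimal Python sketch of one selection round, assuming a HuggingFace-style masked language model. Scoring candidates by masked-LM loss and the names mlm_loss and active_learning_round are illustrative assumptions for exposition, not AfroLM's actual implementation; see the repository above for the authors' code.

import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

def mlm_loss(model, tokenizer, text, mask_prob=0.15):
    """Score one sentence by its MLM loss; higher loss ~ more informative.
    ASSUMPTION: MLM loss as the informativeness score is illustrative only."""
    enc = tokenizer(text, return_tensors="pt", truncation=True)
    input_ids = enc["input_ids"].clone()
    labels = input_ids.clone()
    # Randomly mask ~15% of non-special tokens; only masked positions are scored.
    special = torch.tensor(
        tokenizer.get_special_tokens_mask(
            input_ids[0].tolist(), already_has_special_tokens=True
        ),
        dtype=torch.bool,
    )
    mask = (torch.rand(input_ids.shape) < mask_prob) & ~special
    if not mask.any():  # nothing got masked: treat the sample as uninformative
        return 0.0
    input_ids[mask] = tokenizer.mask_token_id
    labels[~mask] = -100  # ignore unmasked positions in the loss
    with torch.no_grad():
        out = model(input_ids=input_ids,
                    attention_mask=enc["attention_mask"],
                    labels=labels)
    return out.loss.item()

def active_learning_round(model, tokenizer, train_set, pool, k):
    """Move the k hardest pool samples into the training set."""
    scored = sorted(pool, key=lambda s: mlm_loss(model, tokenizer, s),
                    reverse=True)
    return train_set + scored[:k], scored[k:]

# One round: score the pool, grow the training set, retrain, repeat.
model = AutoModelForMaskedLM.from_pretrained("bert-base-multilingual-cased")
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
train_set, pool = ["seed sentence"], ["candidate one", "candidate two"]
train_set, pool = active_learning_round(model, tokenizer, train_set, pool, k=1)
# ... pretrain `model` on `train_set`, then run the next round.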
