What the [MASK]? Making Sense of Language-Specific BERT Models

03/05/2020
by Debora Nozza, et al.

Recently, Natural Language Processing (NLP) has witnessed impressive progress in many areas, due to the advent of novel, pretrained contextual representation models. In particular, Devlin et al. (2019) proposed a model, called BERT (Bidirectional Encoder Representations from Transformers), which enables researchers to obtain state-of-the-art performance on numerous NLP tasks by fine-tuning the representations on their data set and task, without the need for developing and training highly specific architectures. The authors also released multilingual BERT (mBERT), a model trained on a corpus of 104 languages, which can serve as a universal language model and which obtained impressive results on a zero-shot cross-lingual natural language inference task. Driven by the potential of BERT models, the NLP community has started to investigate and produce a large number of BERT models that are trained on a particular language and tested on a specific data domain and task. These models allow us to evaluate the true potential of mBERT as a universal language model, by comparing its performance to that of the more specific models. This paper presents the current state of the art in language-specific BERT models, providing an overall picture with respect to different dimensions (i.e., architectures, data domains, and tasks). Our aim is to provide an immediate and straightforward overview of the commonalities and differences between language-specific BERT models and mBERT. We also provide an interactive and constantly updated website that can be used to explore the information we have collected, available at https://bertlang.unibocconi.it.
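The paper does not tie its survey to any particular toolkit, but the fine-tuning workflow described above is commonly carried out with the Hugging Face Transformers library. The sketch below is a minimal illustration under that assumption: it loads mBERT alongside one language-specific model (the identifiers bert-base-multilingual-cased and dbmdz/bert-base-italian-cased are illustrative choices, not models singled out by the paper) so that both can be fine-tuned and compared on the same task.

```python
# Minimal sketch (assumption: the Hugging Face Transformers library is used).
# Model identifiers are illustrative; the paper's website at
# https://bertlang.unibocconi.it catalogues many language-specific models.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Multilingual BERT (mBERT), trained on a corpus of 104 languages.
mbert_name = "bert-base-multilingual-cased"
mbert_tokenizer = AutoTokenizer.from_pretrained(mbert_name)
mbert_model = AutoModelForSequenceClassification.from_pretrained(
    mbert_name, num_labels=2
)

# A language-specific BERT (here, an Italian model) for the same task.
lsbert_name = "dbmdz/bert-base-italian-cased"
lsbert_tokenizer = AutoTokenizer.from_pretrained(lsbert_name)
lsbert_model = AutoModelForSequenceClassification.from_pretrained(
    lsbert_name, num_labels=2
)

# Both models expose the same interface, so a single fine-tuning loop
# (or the Trainer API) can be reused to compare mBERT against the
# language-specific model on a given language, domain, and task.
batch = lsbert_tokenizer("Che bella giornata!", return_tensors="pt")
logits = lsbert_model(**batch).logits  # logits from the untrained classification head
```

Because the interface is identical, comparing mBERT with a language-specific model reduces to swapping the model identifier, which is the kind of head-to-head evaluation the paper surveys.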
