Parallel corpora are ideal for extracting a multilingual named entity (M...
In previous work, it has been shown that BERT can adequately align cross...
Multilingual pretrained language models (MPLMs) exhibit multilinguality ...
With the advent of end-to-end deep learning approaches in machine transl...
The size of the vocabulary is a central design choice in large pretraine...
With more than 7000 languages worldwide, multilingual natural language p...
Recent research investigates factual knowledge stored in large pretraine...
Transformers are arguably the main workhorse in recent Natural Language ...
Recently, it has been found that monolingual English language models can...
Annotation projection is an important area in NLP that can greatly contr...
We present a novel encoder-decoder architecture for graph-to-text genera...
It has been shown that multilingual BERT (mBERT) yields high quality mul...
Pretrained language models have achieved a new state of the art on many ...
Word alignments are useful for tasks like statistical and neural machine...
Word embeddings are useful for a wide variety of tasks, but they lack in...
Levy, Søgaard and Goldberg's (2017) S-ID (sentence ID) method applies wo...
Multilingual embeddings build on the success of monolingual embeddings a...