Impact of Domain-Adapted Multilingual Neural Machine Translation in the Medical Domain

12/05/2022
by   Miguel Rios, et al.

Multilingual Neural Machine Translation (MNMT) models leverage many language pairs during training to improve translation quality for low-resource languages by transferring knowledge from high-resource languages. We study the quality of a domain-adapted MNMT model for English-Romanian in the medical domain, using automatic metrics and a human error-typology annotation that includes terminology-specific error categories. Comparing the out-of-domain MNMT model with its in-domain adapted counterpart, we find that the in-domain model outperforms the out-of-domain model on all measured automatic metrics and produces fewer terminology errors.
