Modeling Target-side Inflection in Placeholder Translation

07/01/2021
by Ryokan Ri, et al.

Placeholder translation systems enable users to specify how a particular phrase is translated in the output sentence. The system is trained to output special placeholder tokens, and the user-specified term is injected into the output through context-free replacement of the placeholder token. However, this approach can produce ungrammatical sentences, because the specified term often needs to be inflected according to the context of the output, which is unknown before translation. To address this problem, we propose a novel method of placeholder translation that can inflect specified terms according to the grammatical construction of the output sentence. We extend the sequence-to-sequence architecture with a character-level decoder that takes the lemma of a user-specified term and the words generated by the word-level decoder, and outputs the correctly inflected form of the lemma. We evaluate our approach on a Japanese-to-English translation task in the scientific writing domain, and show that our model incorporates specified terms in the correct form more successfully than other comparable models.
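To make the failure mode concrete, the sketch below contrasts context-free placeholder replacement with a replacement step that inflects the lemma from the surrounding output. All names (`PLACEHOLDER`, `inflect_lemma`, the toy plural rule) are illustrative assumptions, not the paper's actual architecture; the real system uses a learned character-level decoder rather than hand-written rules.

```python
# Hypothetical sketch: context-free vs. inflection-aware placeholder
# replacement. The toy rule stands in for the paper's learned
# character-level decoder.

PLACEHOLDER = "<term>"

def naive_replace(output_tokens, term):
    """Context-free replacement: the term is injected as-is,
    even where the context requires an inflected form."""
    return [term if t == PLACEHOLDER else t for t in output_tokens]

def inflect_lemma(lemma, context_tokens):
    """Toy stand-in for the character-level decoder: pick a surface
    form of the lemma based on the word-level context. Here, a crude
    English plural rule triggered by a preceding numeral/quantifier."""
    idx = context_tokens.index(PLACEHOLDER)
    if idx > 0 and context_tokens[idx - 1] in {"two", "three", "several"}:
        if lemma.endswith("sis"):
            return lemma[:-3] + "ses"  # e.g. analysis -> analyses
        return lemma + "s"
    return lemma

def inflecting_replace(output_tokens, lemma):
    """Replacement that first inflects the lemma from context."""
    form = inflect_lemma(lemma, output_tokens)
    return [form if t == PLACEHOLDER else t for t in output_tokens]

tokens = "we ran two <term> on the data".split()
print(" ".join(naive_replace(tokens, "analysis")))       # ungrammatical
print(" ".join(inflecting_replace(tokens, "analysis")))  # inflected form
```

The point of the contrast is that the correct surface form ("analyses") depends on words produced during decoding, which is why the paper conditions the character-level decoder on the word-level decoder's output rather than replacing the placeholder after the fact.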

