Levenshtein Training for Word-level Quality Estimation

09/12/2021
by Shuoyang Ding, et al.

We propose a novel scheme to use the Levenshtein Transformer to perform the task of word-level quality estimation. A Levenshtein Transformer is a natural fit for this task: trained to perform decoding in an iterative manner, a Levenshtein Transformer can learn to post-edit without explicit supervision. To further minimize the mismatch between the translation task and the word-level QE task, we propose a two-stage transfer learning procedure on both augmented data and human post-editing data. We also propose heuristics to construct reference labels that are compatible with subword-level finetuning and inference. Results on WMT 2020 QE shared task dataset show that our proposed method has superior data efficiency under the data-constrained setting and competitive performance under the unconstrained setting.
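The word-level QE task described above assigns an OK/BAD tag to each machine-translated token, and the paper's reference labels come from comparing the MT output to a human post-edit. As a rough illustration of that labeling idea (not the authors' exact subword-level heuristic), the sketch below derives OK/BAD tags from a standard Levenshtein alignment between an MT hypothesis and its post-edit: tokens the alignment keeps are OK, tokens it substitutes or deletes are BAD.

```python
def qe_word_tags(mt_tokens, pe_tokens):
    """Tag each MT token OK/BAD via Levenshtein alignment to the post-edit.

    Illustrative sketch only: real WMT QE references also handle gaps
    (insertions) and subword segmentation, which are omitted here.
    """
    m, n = len(mt_tokens), len(pe_tokens)
    # dp[i][j] = minimum edits to turn mt_tokens[:i] into pe_tokens[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if mt_tokens[i - 1] == pe_tokens[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # delete MT token
                           dp[i][j - 1] + 1,         # insert PE token
                           dp[i - 1][j - 1] + cost)  # keep or substitute

    # Backtrace: a kept token is OK; a substituted or deleted one is BAD.
    tags = [None] * m
    i, j = m, n
    while i > 0:
        if (j > 0 and mt_tokens[i - 1] == pe_tokens[j - 1]
                and dp[i][j] == dp[i - 1][j - 1]):
            tags[i - 1] = "OK"
            i, j = i - 1, j - 1
        elif j > 0 and dp[i][j] == dp[i - 1][j - 1] + 1:
            tags[i - 1] = "BAD"          # substitution
            i, j = i - 1, j - 1
        elif dp[i][j] == dp[i - 1][j] + 1:
            tags[i - 1] = "BAD"          # deletion
            i -= 1
        else:
            j -= 1                       # insertion: no MT token to tag
    return tags
```

For example, `qe_word_tags("the cat sat".split(), "the dog sat".split())` tags the substituted token BAD and the rest OK.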


