Mixed-Lingual Pre-training for Cross-lingual Summarization

10/18/2020
by   Ruochen Xu, et al.

Cross-lingual Summarization (CLS) aims at producing a summary in the target language for an article in the source language. Traditional solutions employ a two-step approach, i.e., translate-then-summarize or summarize-then-translate. Recently, end-to-end models have achieved better results, but these approaches are mostly limited by their dependence on large-scale labeled data. We propose a solution based on mixed-lingual pre-training that leverages both cross-lingual tasks such as translation and monolingual tasks like masked language modeling. Thus, our model can leverage massive amounts of monolingual data to enhance its language modeling. Moreover, the architecture has no task-specific components, which saves memory and increases optimization efficiency. We show in experiments that this pre-training scheme can effectively boost the performance of cross-lingual summarization. On the Neural Cross-Lingual Summarization (NCLS) dataset, our model improves ROUGE-1 scores by 2.82 (English to Chinese) and 1.15 (Chinese to English) over state-of-the-art results.
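The abstract's key design point is that translation pairs and monolingual masked-language examples are cast into a single source-to-target text format, so one shared encoder-decoder can train on all tasks without task-specific heads. The sketch below illustrates that idea with hypothetical helper functions (the function names, the `[MASK]` token, and the batching scheme are assumptions for illustration, not the paper's actual pipeline):

```python
import random

# Hypothetical sketch: cast monolingual denoising and translation
# into one source -> target format so a single encoder-decoder
# handles every pre-training task with no task-specific components.

MASK = "[MASK]"

def make_denoising_example(sentence, mask_prob=0.3, rng=None):
    """Mask random tokens; the model must reconstruct the original."""
    rng = rng or random.Random(0)
    tokens = sentence.split()
    masked = [MASK if rng.random() < mask_prob else t for t in tokens]
    return {"task": "mlm", "source": " ".join(masked), "target": sentence}

def make_translation_example(src_sentence, tgt_sentence):
    """Translation pair: source-language input, target-language output."""
    return {"task": "mt", "source": src_sentence, "target": tgt_sentence}

def mixed_batches(mono, parallel, batch_size=4, seed=0):
    """Interleave examples from both tasks into shared training batches."""
    rng = random.Random(seed)
    pool = [make_denoising_example(s, rng=rng) for s in mono]
    pool += [make_translation_example(s, t) for s, t in parallel]
    rng.shuffle(pool)
    for i in range(0, len(pool), batch_size):
        yield pool[i:i + batch_size]
```

Because every example reduces to a text-in/text-out pair, the same sequence-to-sequence loss applies uniformly, which is what allows the monolingual data to supplement the scarce labeled cross-lingual summarization data.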


