From Clozing to Comprehending: Retrofitting Pre-trained Language Model to Pre-trained Machine Reader

12/09/2022
by Weiwen Xu, et al.

We present Pre-trained Machine Reader (PMR), a novel method for retrofitting Pre-trained Language Models (PLMs) into Machine Reading Comprehension (MRC) models without acquiring labeled data. PMR resolves the discrepancy between model pre-training and the downstream fine-tuning of existing PLMs, and provides a unified solver for various extraction tasks. To achieve this, we construct a large volume of general-purpose, high-quality MRC-style training data with the help of Wikipedia hyperlinks and design a Wiki Anchor Extraction (WAE) task to guide the MRC-style pre-training process. Although conceptually simple, PMR is particularly effective at solving extraction tasks, including Extractive Question Answering and Named Entity Recognition, where it shows tremendous improvements over previous approaches, especially under low-resource settings. Moreover, by viewing sequence classification as a special case of extraction in our MRC formulation, PMR can even extract high-quality rationales to explain the classification process, making its predictions more explainable.
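To make the Wiki Anchor Extraction (WAE) pre-training task concrete, below is a minimal sketch, in Python, of how one MRC-style training example could be derived from a Wikipedia hyperlink: the definition of the linked entity serves as the query, the passage containing the anchor serves as the context, and the anchor span is the answer to extract. The names here (`Anchor`, `build_wae_example`) are hypothetical illustrations, not the authors' released code.

```python
# Hypothetical sketch of MRC-style data construction from a Wikipedia
# hyperlink, following the WAE idea described in the abstract.
from dataclasses import dataclass

@dataclass
class Anchor:
    """A Wikipedia hyperlink: its surface text, span offsets, and target page."""
    surface: str             # anchor text as it appears in the passage
    start: int               # character offset where the anchor begins
    end: int                 # character offset one past the anchor's end
    target_definition: str   # opening sentence(s) of the linked entity's page

def build_wae_example(passage: str, anchor: Anchor) -> dict:
    """Turn one hyperlink into an MRC triple: the linked entity's definition
    is the query, the passage is the context, and the anchor span is the
    answer the reader must extract."""
    return {
        "query": anchor.target_definition,
        "context": passage,
        "answer_span": (anchor.start, anchor.end),
        "answer_text": passage[anchor.start:anchor.end],
    }

# Example: an anchor on "Paris" linking to the page that defines the city.
passage = "The Eiffel Tower is located in Paris, the capital of France."
anchor = Anchor(
    surface="Paris",
    start=passage.index("Paris"),
    end=passage.index("Paris") + len("Paris"),
    target_definition="Paris is the capital and most populous city of France.",
)
example = build_wae_example(passage, anchor)
assert example["answer_text"] == "Paris"
```

Scaled over Wikipedia, such (query, context, answer) triples supply the large volume of MRC-style pre-training data the abstract describes. Downstream extraction tasks can then reuse the same interface with only the query changing, e.g., an entity-type description as the query for Named Entity Recognition.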


Related research

01/25/2023 · ViDeBERTa: A powerful pre-trained language model for Vietnamese
This paper presents ViDeBERTa, a new pre-trained monolingual language mo...

04/11/2022 · A Comparative Study of Pre-trained Encoders for Low-Resource Named Entity Recognition
Pre-trained language models (PLM) are effective components of few-shot n...

10/12/2020 · Multi-Stage Pre-training for Low-Resource Domain Adaptation
Transfer learning techniques are particularly useful in NLP tasks where ...

10/14/2019 · Training Compact Models for Low Resource Entity Tagging using Pre-trained Language Models
Training models on low-resource named entity recognition tasks has been ...

07/19/2021 · Bridging the Gap between Language Model and Reading Comprehension: Unsupervised MRC via Self-Supervision
Despite recent success in machine reading comprehension (MRC), learning ...

03/26/2020 · Common-Knowledge Concept Recognition for SEVA
We build a common-knowledge concept recognition system for a Systems Eng...

10/18/2022 · Alibaba-Translate China's Submission for WMT 2022 Metrics Shared Task
In this report, we present our submission to the WMT 2022 Metrics Shared...
