Contextual Knowledge Selection and Embedding towards Enhanced Pre-Trained Language Models

09/29/2020
by Yusheng Su, et al.

Several recent efforts have been devoted to enhancing pre-trained language models (PLMs) with extra heterogeneous knowledge from knowledge graphs (KGs), achieving consistent improvements on various knowledge-driven NLP tasks. However, most of these knowledge-enhanced PLMs embed static sub-graphs of KGs ("knowledge context"), ignoring the fact that the knowledge required by a PLM may change dynamically with the specific text ("textual context"). In this paper, we propose a novel framework named DKPLM that dynamically selects and embeds knowledge context according to the textual context, avoiding the effects of redundant and ambiguous knowledge in KGs that does not match the input text. Our experimental results show that DKPLM outperforms various baselines on typical knowledge-driven NLP tasks, indicating the effectiveness of dynamic knowledge context for language understanding. Beyond these performance improvements, the knowledge dynamically selected by DKPLM describes the semantics of text-related knowledge in a more interpretable form than conventional PLMs do. Our source code and datasets will be released to provide further details on DKPLM.
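To make the selection step concrete, below is a minimal sketch of how text-conditioned knowledge selection might look, assuming cosine similarity as the relevance score and a top-k cutoff; the function name select_knowledge_context, the scoring function, and the cutoff are illustrative assumptions, not the paper's actual implementation.

    import torch
    import torch.nn.functional as F

    def select_knowledge_context(text_emb: torch.Tensor,
                                 triple_embs: torch.Tensor,
                                 k: int = 4):
        """Rank candidate KG triple embeddings by similarity to the
        textual-context embedding and keep the top-k (a hypothetical
        stand-in for DKPLM's selection step)."""
        # Cosine similarity between the sentence embedding (d,) and each
        # of the n candidate triple embeddings (n, d) -> scores of shape (n,).
        scores = F.cosine_similarity(text_emb.unsqueeze(0), triple_embs, dim=-1)
        top = torch.topk(scores, k=min(k, triple_embs.size(0)))
        # Only the selected triples are embedded alongside the input text;
        # the rest are treated as redundant or ambiguous for this sentence.
        return triple_embs[top.indices], top.values

    # Toy usage: one 768-dim sentence embedding, ten candidate triples.
    text_emb = torch.randn(768)
    triple_embs = torch.randn(10, 768)
    selected, scores = select_knowledge_context(text_emb, triple_embs, k=4)
    print(selected.shape, scores)  # torch.Size([4, 768]) and 4 scores

The point of conditioning on the textual context is visible even in this toy version: a different sentence embedding yields a different top-k set of triples, whereas a static sub-graph would feed the same knowledge to every input.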

Related research

05/17/2019
ERNIE: Enhanced Language Representation with Informative Entities
Neural language representation models such as BERT pre-trained on large-...

03/17/2022
Enhanced Temporal Knowledge Embeddings with Contextualized Language Representations
With the emerging research effort to integrate structured and unstructur...

10/02/2020
JAKET: Joint Pre-training of Knowledge Graph and Language Understanding
Knowledge graphs (KGs) contain rich information about world knowledge, e...

04/15/2018
Context and Humor: Understanding Amul advertisements of India
Contextual knowledge is the most important element in understanding lang...

05/02/2023
UNTER: A Unified Knowledge Interface for Enhancing Pre-trained Language Models
Recent research demonstrates that external knowledge injection can advan...

08/01/2022
DictBERT: Dictionary Description Knowledge Enhanced Language Model Pre-training via Contrastive Learning
Although pre-trained language models (PLMs) have achieved state-of-the-a...

10/13/2021
Towards Efficient NLP: A Standard Evaluation and A Strong Baseline
Supersized pre-trained language models have pushed the accuracy of vario...
