Fusing Context Into Knowledge Graph for Commonsense Reasoning

12/09/2020
by Yichong Xu, et al.

Commonsense reasoning requires a model to make presumptions about world events via language understanding. Many methods couple pre-trained language models with knowledge graphs to combine the merits of language modeling and entity-based relational learning. However, although a knowledge graph contains rich structural information, it lacks the context needed for a more precise understanding of concepts and relations. This creates a gap when fusing knowledge graphs into language modeling, especially when paired text-knowledge data is scarce. In this paper, we propose to utilize external entity descriptions to provide contextual information for graph entities. For the CommonsenseQA task, our model first extracts concepts from the question and choice, and then finds a related triple between these concepts. Next, it retrieves the descriptions of these concepts from Wiktionary and feeds them, together with the triple, as additional input to a pre-trained language model. The resulting model attains much more effective commonsense reasoning capability, achieving state-of-the-art results on the CommonsenseQA dataset with an accuracy of 80.7% on the official leaderboard.
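The abstract describes a four-step pipeline: concept extraction, triple lookup, gloss retrieval, and fused input construction. The sketch below illustrates that input-fusion step under stated assumptions; the toy knowledge graph, the Wiktionary-style gloss table, and every helper name (extract_concepts, find_triple, build_fused_input) are hypothetical stand-ins, not the paper's implementation.

```python
# Minimal sketch of the context-fusion pipeline from the abstract:
# extract concepts, look up a connecting triple, retrieve glosses,
# and assemble one fused input string per answer choice for a
# pre-trained language model. All data and helper names are toy examples.

# Toy ConceptNet-style triples: (head, relation, tail).
TRIPLES = [
    ("revolving door", "AtLocation", "bank"),
    ("revolving door", "IsA", "door"),
]

# Toy Wiktionary-style glosses keyed by concept.
DESCRIPTIONS = {
    "revolving door": "a door that rotates around a vertical axis",
    "bank": "an institution where one can deposit or borrow money",
    "door": "a portal through which one can enter or leave a building",
}

def extract_concepts(question: str, choice: str) -> tuple[str, str]:
    """Hypothetical concept extraction: pick the longest graph concept
    that appears in the question; use the answer choice itself as the
    second concept."""
    candidates = {c for h, _, t in TRIPLES for c in (h, t)}
    in_question = [c for c in candidates if c in question.lower()]
    q_concept = max(in_question, key=len) if in_question else ""
    return q_concept, choice.lower()

def find_triple(q_concept: str, a_concept: str):
    """Return a triple connecting the two concepts, if one exists."""
    for head, rel, tail in TRIPLES:
        if {head, tail} == {q_concept, a_concept}:
            return head, rel, tail
    return None

def build_fused_input(question: str, choice: str) -> str:
    """Concatenate question, choice, triple, and glosses into one
    sequence, using [SEP] the way BERT-style models delimit segments."""
    q_concept, a_concept = extract_concepts(question, choice)
    triple = find_triple(q_concept, a_concept)
    triple_text = " ".join(triple) if triple else ""
    parts = [
        question,
        choice,
        triple_text,
        DESCRIPTIONS.get(q_concept, ""),
        DESCRIPTIONS.get(a_concept, ""),
    ]
    return " [SEP] ".join(parts)

if __name__ == "__main__":
    question = ("A revolving door is convenient for two direction travel, "
                "but it also serves as a security measure where?")
    for choice in ["bank", "library", "mall"]:
        print(build_fused_input(question, choice))
```

In a full system, one such fused sequence would be built for each answer choice and scored by a pre-trained language model (e.g., a BERT-family encoder), with the highest-scoring choice selected as the answer.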


Related research

10/02/2020
JAKET: Joint Pre-training of Knowledge Graph and Language Understanding
Knowledge graphs (KGs) contain rich information about world knowledge, e...

06/08/2021
TIMEDIAL: Temporal Commonsense Reasoning in Dialog
Everyday conversations require understanding everyday events, which in t...

05/04/2023
Toward the Automated Construction of Probabilistic Knowledge Graphs for the Maritime Domain
International maritime crime is becoming increasingly sophisticated, oft...

10/24/2020
Learning Contextualized Knowledge Structures for Commonsense Reasoning
Recently, neural-symbolic architectures have achieved success on commons...

03/12/2023
LUKE-Graph: A Transformer-based Approach with Gated Relational Graph Attention for Cloze-style Reading Comprehension
Incorporating prior knowledge can improve existing pre-training models i...

05/03/2023
Causality-aware Concept Extraction based on Knowledge-guided Prompting
Concepts benefit natural language understanding but are far from complet...
