Generating Logical Forms from Graph Representations of Text and Entities

05/21/2019
by Peter Shaw, et al.

Structured information about entities is critical for many semantic parsing tasks. We present an approach that uses a Graph Neural Network (GNN) architecture to incorporate information about relevant entities and their relations during parsing. Combined with a decoder copy mechanism, this approach provides a conceptually simple way to generate logical forms with entities. We demonstrate that this approach is competitive with the state of the art across several tasks without pre-training, and outperforms existing approaches when combined with BERT pre-training.
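
To make the two pieces of the abstract concrete, here is a minimal illustrative sketch of (1) a GNN layer that propagates information between entity nodes over their relation graph, and (2) a decoder step whose output distribution covers both vocabulary tokens and "copy this entity node" actions. This is not the authors' implementation; all module names, sizes, and the single untyped adjacency matrix are simplifying assumptions made for the example.

```python
# Minimal sketch (assumed names and shapes, not the paper's code):
# a GNN over entity nodes plus a decoder step with a copy mechanism.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EntityGNNEncoder(nn.Module):
    """One round of message passing over an entity graph given as an
    adjacency matrix. Typed relations could use per-relation weights;
    a single weight matrix keeps the sketch short."""

    def __init__(self, dim):
        super().__init__()
        self.message = nn.Linear(dim, dim)
        self.update = nn.Linear(2 * dim, dim)

    def forward(self, node_states, adjacency):
        # node_states: (num_nodes, dim); adjacency: (num_nodes, num_nodes)
        messages = adjacency @ self.message(node_states)
        return torch.tanh(self.update(torch.cat([node_states, messages], dim=-1)))


class CopyDecoderStep(nn.Module):
    """One decoding step: score output-vocabulary tokens and entity nodes
    jointly, so the model can either generate a symbol or copy an entity
    into the logical form."""

    def __init__(self, dim, vocab_size):
        super().__init__()
        self.vocab_proj = nn.Linear(dim, vocab_size)
        self.copy_proj = nn.Linear(dim, dim)

    def forward(self, decoder_state, node_states):
        vocab_scores = self.vocab_proj(decoder_state)              # (vocab_size,)
        copy_scores = node_states @ self.copy_proj(decoder_state)  # (num_nodes,)
        # One softmax over the concatenation gives a distribution over
        # "generate token i" and "copy entity j" actions.
        return F.softmax(torch.cat([vocab_scores, copy_scores]), dim=-1)


if __name__ == "__main__":
    dim, vocab_size, num_nodes = 16, 32, 4
    encoder = EntityGNNEncoder(dim)
    step = CopyDecoderStep(dim, vocab_size)
    nodes = torch.randn(num_nodes, dim)
    adjacency = torch.eye(num_nodes)
    probs = step(torch.randn(dim), encoder(nodes, adjacency))
    print(probs.shape)  # 36 = 32 vocabulary tokens + 4 copyable entity nodes
```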


Related research

04/05/2020  TAPAS: Weakly Supervised Table Parsing via Pre-training
Answering natural language questions over tables is usually seen as a se...

03/15/2022  Graph Pre-training for AMR Parsing and Generation
Abstract meaning representation (AMR) highlights the core semantic infor...

09/29/2016  Semantic Parsing with Semi-Supervised Sequential Autoencoders
We present a novel semi-supervised approach for sequence transduction an...

10/19/2020  An Empirical Study for Vietnamese Constituency Parsing with Pre-training
In this work, we use a span-based approach for Vietnamese constituency p...

05/27/2019  Compositional pre-training for neural semantic parsing
Semantic parsing is the process of translating natural language utteranc...

10/04/2022  Guiding the PLMs with Semantic Anchors as Intermediate Supervision: Towards Interpretable Semantic Parsing
The recent prevalence of pretrained language models (PLMs) has dramatica...

08/16/2022  Mask and Reason: Pre-Training Knowledge Graph Transformers for Complex Logical Queries
Knowledge graph (KG) embeddings have been a mainstream approach for reas...