Improving Grammar-based Sequence-to-Sequence Modeling with Decomposition and Constraints

06/05/2023
by Chao Lou, et al.

Neural QCFG is a grammar-based sequence-to-sequence (seq2seq) model with strong inductive biases toward hierarchical structures. It excels in interpretability and generalization but suffers from expensive inference. In this paper, we study two low-rank variants of Neural QCFG that enable faster inference, with different trade-offs between efficiency and expressiveness. Furthermore, utilizing the symbolic interface provided by the grammar, we introduce two soft constraints over tree hierarchy and source coverage. We experiment with various datasets and find that our models outperform vanilla Neural QCFG in most settings.
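The abstract does not spell out the low-rank construction, but the general idea behind such variants can be sketched as follows: a binary-rule score tensor T[a][b][c] is factored into rank-R components, so inside-algorithm contractions run in O(N·R) per span pair instead of O(N³), without ever materializing the full tensor. All names (U, V, W, beta_B, beta_C) and sizes below are illustrative, not taken from the paper.

```python
import random

random.seed(0)
N, R = 6, 2  # N nonterminals, rank R << N (toy sizes for illustration)

# Low-rank factors of a hypothetical binary-rule score tensor:
#   T[a][b][c] ~= sum_k U[a][k] * V[b][k] * W[c][k]
U = [[random.random() for _ in range(R)] for _ in range(N)]
V = [[random.random() for _ in range(R)] for _ in range(N)]
W = [[random.random() for _ in range(R)] for _ in range(N)]

beta_B = [random.random() for _ in range(N)]  # inside scores, left child span
beta_C = [random.random() for _ in range(N)]  # inside scores, right child span

# Naive contraction over the full (materialized) tensor: O(N^3) per span pair.
naive = [
    sum(
        sum(U[a][k] * V[b][k] * W[c][k] for k in range(R)) * beta_B[b] * beta_C[c]
        for b in range(N) for c in range(N)
    )
    for a in range(N)
]

# Factored contraction: contract each factor with its child first, O(N * R).
vb = [sum(V[b][k] * beta_B[b] for b in range(N)) for k in range(R)]
wc = [sum(W[c][k] * beta_C[c] for c in range(N)) for k in range(R)]
fast = [sum(U[a][k] * vb[k] * wc[k] for k in range(R)) for a in range(N)]

# Both routes give the same inside scores for the parent nonterminals.
assert all(abs(x - y) < 1e-9 for x, y in zip(naive, fast))
```

The speedup comes purely from reassociating the sums; the factored model is less expressive only insofar as R limits the rank of the rule tensor it can represent, which is the efficiency/expressiveness trade-off the abstract refers to.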


