Neural Tree Indexers for Text Understanding

07/15/2016
by Tsendsuren Munkhdalai, et al.

Recurrent neural networks (RNNs) process input text sequentially and model the conditional transition between word tokens. Recursive networks, in contrast, explicitly model the compositionality and recursive structure of natural language, but current recursive architectures are limited by their dependence on syntactic trees. In this paper, we introduce a robust, syntactic-parsing-independent tree-structured model, Neural Tree Indexers (NTI), that provides a middle ground between sequential RNNs and syntactic tree-based recursive models. NTI constructs a full n-ary tree by processing the input text with its node function in a bottom-up fashion. An attention mechanism can then be applied to both the structure and the node function. We implemented and evaluated a binary-tree model of NTI, showing that it achieves state-of-the-art performance on three different NLP tasks: natural language inference, answer sentence selection, and sentence classification, outperforming state-of-the-art recurrent and recursive neural networks.
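To make the bottom-up composition concrete, here is a minimal NumPy sketch of how a binary-tree NTI could reduce a sequence of word vectors to a single root representation. Everything below is an assumption for illustration: the abstract does not specify the node function, the padding scheme, or any of the names used here (compose, nti_encode, W), and the sketch omits the attention mechanism over structure and node function.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                                       # hidden size (illustrative)
W = rng.normal(scale=0.1, size=(2 * d, d))  # hypothetical node-function weights

def compose(left, right):
    """Hypothetical node function: merge two child vectors into a parent.
    The paper treats the node function as a learned component; a single
    tanh layer stands in for it here."""
    return np.tanh(np.concatenate([left, right]) @ W)

def nti_encode(tokens):
    """Compose token vectors pairwise, level by level, up a full binary
    tree until a single root vector summarizes the input text."""
    level = list(tokens)
    # Pad with zero vectors to a power of two so the tree is full.
    while len(level) & (len(level) - 1):
        level.append(np.zeros(d))
    while len(level) > 1:
        level = [compose(level[i], level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

sentence = [rng.normal(size=d) for _ in range(5)]  # five dummy word vectors
root = nti_encode(sentence)
print(root.shape)  # (8,)
```

The design point the abstract emphasizes is that this tree is fixed and full rather than derived from a syntactic parse, so no parser is required at encoding time.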


Related research

09/07/2018 · Dynamic Compositionality in Recursive Neural Networks with Structure-aware Tag Representations
Most existing recursive neural network (RvNN) architectures utilize only...

01/07/2017 · Structural Attention Neural Networks for improved sentiment analysis
We introduce a tree-structured attention neural network for sentences an...

12/02/2013 · Bidirectional Recursive Neural Networks for Token-Level Labeling with Structure
Recently, deep architectures, such as recurrent and recursive neural net...

01/06/2021 · Can RNNs learn Recursive Nested Subject-Verb Agreements?
One of the fundamental principles of contemporary linguistics states tha...

08/30/2018 · Iterative Recursive Attention Model for Interpretable Sequence Classification
Natural language processing has greatly benefited from the introduction ...

10/02/2017 · Attentive Convolution
In NLP, convolution neural networks (CNNs) have benefited less than recu...

02/28/2015 · When Are Tree Structures Necessary for Deep Learning of Representations?
Recursive neural models, which use syntactic parse trees to recursively ...
