Sentence-State LSTM for Text Representation

05/07/2018
by Yue Zhang, et al.

Bi-directional LSTMs are a powerful tool for text representation. However, they have been shown to suffer from various limitations due to their sequential nature. We investigate an alternative LSTM structure for encoding text, which consists of a parallel state for each word. Recurrent steps are used to perform local and global information exchange between words simultaneously, rather than incremental reading of a sequence of words. Results on various classification and sequence-labelling benchmarks show that the proposed model has strong representation power, giving highly competitive performance compared with stacked BiLSTM models that have a similar number of parameters.
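
To make the idea concrete, below is a minimal PyTorch sketch of a sentence-state recurrence, not the authors' implementation: every word keeps its own LSTM state, one extra sentence-level state is maintained alongside them, and at each recurrent step every word state reads from its left and right neighbours and from the sentence state in parallel. The class name, the use of nn.LSTMCell, the one-word context window, the mean/max aggregation for the sentence state, and the default number of steps are illustrative assumptions rather than details taken from the paper.

```python
import torch
import torch.nn as nn


class SentenceStateLSTM(nn.Module):
    """Minimal sketch of the sentence-state idea: every word keeps its own
    LSTM state, plus one global sentence state; each recurrent step lets
    every word exchange information with its neighbours and with the
    sentence state in parallel, instead of reading words left to right."""

    def __init__(self, input_size, hidden_size, steps=7):
        super().__init__()
        self.steps = steps
        self.hidden_size = hidden_size
        # Word update sees [x_i, h_{i-1}, h_{i+1}, g] on top of its own state.
        self.word_cell = nn.LSTMCell(input_size + 3 * hidden_size, hidden_size)
        # Sentence update sees a mean/max summary of all word states.
        self.sent_cell = nn.LSTMCell(2 * hidden_size, hidden_size)

    def forward(self, x):
        # x: (batch, seq_len, input_size) word embeddings
        b, n, _ = x.shape
        h = x.new_zeros(b, n, self.hidden_size)   # per-word hidden states
        c = x.new_zeros(b, n, self.hidden_size)   # per-word cell states
        g = x.new_zeros(b, self.hidden_size)      # sentence hidden state
        cg = x.new_zeros(b, self.hidden_size)     # sentence cell state
        pad = x.new_zeros(b, 1, self.hidden_size)
        for _ in range(self.steps):
            left = torch.cat([pad, h[:, :-1]], dim=1)   # neighbour h_{i-1}
            right = torch.cat([h[:, 1:], pad], dim=1)   # neighbour h_{i+1}
            ctx = torch.cat(
                [x, left, right, g.unsqueeze(1).expand(-1, n, -1)], dim=-1
            )
            # All word states are updated simultaneously: one LSTMCell call
            # over the flattened (batch x word) axis.
            h_flat, c_flat = self.word_cell(
                ctx.reshape(b * n, -1),
                (h.reshape(b * n, -1), c.reshape(b * n, -1)),
            )
            h, c = h_flat.view(b, n, -1), c_flat.view(b, n, -1)
            # The sentence state summarises all word states at every step.
            summary = torch.cat([h.mean(dim=1), h.max(dim=1).values], dim=-1)
            g, cg = self.sent_cell(summary, (g, cg))
        return h, g  # per-word representations and the sentence vector
```

Applied to a (batch, seq_len, input_size) embedding tensor, the returned per-word states can feed a sequence-labelling head and the sentence vector a classification head, which mirrors the two kinds of benchmarks the abstract mentions.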

Related research

02/29/2020 · Depth-Adaptive Graph Recurrent Network for Text Classification
The Sentence-State LSTM (S-LSTM) is a powerful and highly efficient graph ...

03/20/2019 · Neural Speed Reading with Structural-Jump-LSTM
Recurrent neural networks (RNNs) can model natural language by sequentially ...

10/10/2016 · Neural Paraphrase Generation with Stacked Residual LSTM Networks
In this paper, we propose a novel neural approach for paraphrase generat...

05/05/2018 · Chinese NER Using Lattice LSTM
We investigate a lattice-structured LSTM model for Chinese NER, which en...

11/21/2016 · Bidirectional Tree-Structured LSTM with Head Lexicalization
Sequential LSTM has been extended to model tree structures, giving compe...

11/06/2022 · Suffix Retrieval-Augmented Language Modeling
Causal language modeling (LM) uses word history to predict the next word...

09/18/2019 · Recursive Graphical Neural Networks for Text Classification
The complicated syntax structure of natural language is hard to be expli...
