Sliced Recurrent Neural Networks

07/06/2018
by Zeping Yu, et al.

Recurrent neural networks have achieved great success in many NLP tasks. However, their recurrent structure makes them difficult to parallelize, so training RNNs is time-consuming. In this paper, we introduce sliced recurrent neural networks (SRNNs), which can be parallelized by slicing the sequences into many subsequences. SRNNs are able to obtain high-level information through multiple layers with few extra parameters. We prove that the standard RNN is a special case of the SRNN when linear activation functions are used. Without changing the recurrent units, SRNNs are 136 times as fast as standard RNNs and can be even faster when training longer sequences. Experiments on six large-scale sentiment analysis datasets show that SRNNs achieve better performance than standard RNNs.
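To make the slicing idea concrete, below is a minimal sketch (not the authors' implementation): the input sequence is cut into equal-length subsequences, each subsequence is encoded in parallel by a shared recurrent layer, and a second recurrent layer then runs over the resulting subsequence representations. The SlicedRNN class, the choice of GRU cells, PyTorch, and all hyperparameters here are illustrative assumptions.

import torch
import torch.nn as nn

class SlicedRNN(nn.Module):
    """Two-level SRNN sketch: slice the sequence, encode each slice with a
    shared GRU in parallel, then run a second GRU over the slice states."""

    def __init__(self, input_size, hidden_size, num_slices):
        super().__init__()
        self.num_slices = num_slices
        self.bottom = nn.GRU(input_size, hidden_size, batch_first=True)
        self.top = nn.GRU(hidden_size, hidden_size, batch_first=True)

    def forward(self, x):
        # x: (batch, seq_len, input_size); seq_len must be divisible by num_slices
        batch, seq_len, dim = x.shape
        slice_len = seq_len // self.num_slices
        # Fold the slices into the batch dimension so they are processed in parallel
        x = x.reshape(batch * self.num_slices, slice_len, dim)
        _, h = self.bottom(x)                               # (1, batch*num_slices, hidden)
        h = h.squeeze(0).reshape(batch, self.num_slices, -1)
        _, h_top = self.top(h)                              # summarize the slice states
        return h_top.squeeze(0)                             # (batch, hidden)

# Example: a batch of 8 sequences of length 64, sliced into 8 subsequences of 8 steps
model = SlicedRNN(input_size=32, hidden_size=64, num_slices=8)
out = model(torch.randn(8, 64, 32))
print(out.shape)  # torch.Size([8, 64])

With two levels and 8 slices, each recurrent pass is only seq_len/8 steps long, which is where the speed-up over a single 64-step recurrence comes from; deeper slicing hierarchies extend the same idea.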

Related research

- 10/10/2017 · Network of Recurrent Neural Networks: We describe a class of systems theory based neural networks called "Netw...
- 03/11/2023 · Resurrecting Recurrent Neural Networks for Long Sequences: Recurrent Neural Networks (RNNs) offer fast inference on long sequences ...
- 11/17/2019 · Multi-Zone Unit for Recurrent Neural Networks: Recurrent neural networks (RNNs) have been widely used to deal with sequ...
- 10/08/2020 · A Fully Tensorized Recurrent Neural Network: Recurrent neural networks (RNNs) are powerful tools for sequential model...
- 05/31/2021 · Learning and Generalization in RNNs: Simple recurrent neural networks (RNNs) and their more advanced cousins ...
- 06/07/2019 · Recurrent Kernel Networks: Substring kernels are classical tools for representing biological sequen...
- 08/06/2021 · Path classification by stochastic linear recurrent neural networks: We investigate the functioning of a classifying biological neural networ...