Simplified Gating in Long Short-term Memory (LSTM) Recurrent Neural Networks

01/12/2017
by Yuzhen Lu, et al.

The standard LSTM recurrent neural network, while very powerful in sequence applications with long-range dependencies, has a highly complex structure and a relatively large number of (adaptive) parameters. In this work, we present an empirical comparison between the standard LSTM recurrent neural network architecture and three new parameter-reduced variants obtained by eliminating combinations of the input signal, bias, and hidden unit signals from the individual gating signals. Experiments on two sequence datasets show that the three new variants, referred to simply as LSTM1, LSTM2, and LSTM3, can achieve performance comparable to the standard LSTM model with fewer (adaptive) parameters.
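For concreteness, the sketch below illustrates one way of reducing the gating parameters in the spirit of the abstract: the input, forget, and output gates drop the input-signal term and are driven by the hidden state and a bias alone, while the candidate cell update is left unchanged. This is a minimal NumPy sketch under assumed names and shapes; which terms each of LSTM1, LSTM2, and LSTM3 actually removes is defined in the full paper, not here.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step_standard(x, h, c, p):
    # Standard LSTM gating: every gate sees the input x, the hidden state h, and a bias.
    i = sigmoid(p["Wi"] @ x + p["Ui"] @ h + p["bi"])   # input gate
    f = sigmoid(p["Wf"] @ x + p["Uf"] @ h + p["bf"])   # forget gate
    o = sigmoid(p["Wo"] @ x + p["Uo"] @ h + p["bo"])   # output gate
    g = np.tanh(p["Wc"] @ x + p["Uc"] @ h + p["bc"])   # candidate cell update
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def lstm_step_reduced(x, h, c, p):
    # Illustrative parameter-reduced gating: the three gates omit the W @ x term
    # and depend only on the hidden state and bias; the cell candidate is unchanged.
    i = sigmoid(p["Ui"] @ h + p["bi"])
    f = sigmoid(p["Uf"] @ h + p["bf"])
    o = sigmoid(p["Uo"] @ h + p["bo"])
    g = np.tanh(p["Wc"] @ x + p["Uc"] @ h + p["bc"])
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Toy usage on a short random sequence (hypothetical sizes).
n_in, n_h = 4, 8
rng = np.random.default_rng(0)
p = {k: rng.standard_normal((n_h, n_in if k[0] == "W" else n_h)) * 0.1
     for k in ["Wi", "Wf", "Wo", "Wc", "Ui", "Uf", "Uo", "Uc"]}
p.update({k: np.zeros(n_h) for k in ["bi", "bf", "bo", "bc"]})
h, c = np.zeros(n_h), np.zeros(n_h)
for x in rng.standard_normal((5, n_in)):
    h, c = lstm_step_reduced(x, h, c, p)

In this particular simplification, dropping the three input-to-gate matrices removes 3 * n_h * n_in weights per layer; removing the gate biases or the hidden-state terms instead, as the other variants in the abstract do, changes the savings accordingly.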


