Understanding LSTM – a tutorial into Long Short-Term Memory Recurrent Neural Networks

09/12/2019
by   Ralf C. Staudemeyer, et al.

Long Short-Term Memory Recurrent Neural Networks (LSTM-RNNs) are among the most powerful dynamic classifiers publicly known. The network itself and the related learning algorithms are reasonably well documented, so it is possible to get an idea of how they work. This paper sheds more light on how LSTM-RNNs evolved and why they work impressively well, focusing on the early, ground-breaking publications. We significantly improved the documentation and fixed a number of errors and inconsistencies that had accumulated in previous publications. To support understanding, we also revised and unified the notation used.
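For readers new to the architecture, the standard LSTM cell the paper documents can be sketched in a few lines. The following is a minimal illustration, not the paper's notation: the stacked-parameter layout, function names, and variable names are assumptions made here for clarity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One step of a standard LSTM cell with a forget gate.

    W, U, b hold the stacked parameters for the input gate (i),
    forget gate (f), cell candidate (g), and output gate (o),
    each of size n (the hidden dimension), stacked in that order.
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b          # stacked pre-activations, shape (4*n,)
    i = sigmoid(z[0*n:1*n])             # input gate: how much new input to admit
    f = sigmoid(z[1*n:2*n])             # forget gate: how much old memory to keep
    g = np.tanh(z[2*n:3*n])             # candidate cell state
    o = sigmoid(z[3*n:4*n])             # output gate: how much memory to expose
    c = f * c_prev + i * g              # new cell state (the long-term "memory")
    h = o * np.tanh(c)                  # new hidden state (the cell's output)
    return h, c
```

The multiplicative gates are what let the cell carry information over long time lags: with the forget gate near 1 and the input gate near 0, the cell state passes through essentially unchanged, which is the mechanism behind LSTM's constant error flow.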


