RotLSTM: Rotating Memories in Recurrent Neural Networks

05/01/2021
by Vlad Velici et al.

Long Short-Term Memory (LSTM) units can memorise and exploit long-term dependencies between inputs to generate predictions on time series data. We introduce the concept of modifying the cell state (memory) of LSTMs using rotation matrices parametrised by a new set of trainable weights. This addition yields significant performance gains on several tasks from the bAbI dataset.
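The abstract does not specify how the rotations are parametrised, but one common construction is to build the rotation from trainable angles applied pairwise to the state units. As a hedged sketch (not the authors' implementation): a hypothetical `rotate_state` helper that applies one 2x2 Givens rotation per pair of cell-state units, where the angles would come from a trainable projection in a full RotLSTM cell.

```python
import numpy as np

def rotation_matrix(theta):
    """2x2 Givens rotation block for angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def rotate_state(cell_state, thetas):
    """Rotate an even-length cell state by applying one 2x2 rotation
    per pair of units. In a full RotLSTM cell the angles `thetas`
    would be produced by a trainable function of the input; here they
    are passed in directly (illustrative assumption)."""
    out = cell_state.copy()
    for i, theta in enumerate(thetas):
        pair = slice(2 * i, 2 * i + 2)
        out[pair] = rotation_matrix(theta) @ cell_state[pair]
    return out

# Rotations are norm-preserving, so the memory's magnitude is unchanged.
c = np.array([1.0, 0.0, 0.5, -0.5])
c_rot = rotate_state(c, np.array([np.pi / 2, np.pi / 4]))
print(np.allclose(np.linalg.norm(c_rot), np.linalg.norm(c)))  # True
```

One appealing property of this design is that rotations are orthogonal transformations: they can reorganise which units hold which information without shrinking or amplifying the stored memory.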
