Gated Orthogonal Recurrent Units: On Learning to Forget

06/08/2017
by   Li Jing, et al.

We present a novel recurrent neural network (RNN) model that combines the remembering ability of unitary RNNs with the ability of gated RNNs to effectively forget redundant or irrelevant information in memory. We achieve this by extending unitary RNNs with a gating mechanism. The resulting model, the Gated Orthogonal Recurrent Unit (GORU), outperforms LSTMs, GRUs, and unitary RNNs on several long-term dependency benchmark tasks. We empirically show both that orthogonal/unitary RNNs lack the ability to forget and that GORU can remember long-term dependencies while forgetting irrelevant information, a capability that plays an important role in recurrent neural networks. We provide competitive results, along with an analysis of our model, on many natural sequential tasks, including bAbI question answering, TIMIT speech spectrum prediction, Penn TreeBank, and synthetic tasks involving long-term dependencies, such as the algorithmic, parenthesis, denoising, and copying tasks.
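The core idea described above (GRU-style gates combined with an orthogonal transition matrix) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the orthogonal matrix is obtained here by a QR decomposition rather than the efficient unitary parameterization used in the paper, `tanh` stands in for the paper's modReLU activation, and all parameter names (`W`, `V`, `Wz`, `Vz`, `Wr`, `Vr`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
hidden, inp = 8, 4

# Orthogonal recurrent matrix via QR decomposition (a stand-in for the
# paper's efficient unitary parameterization). W.T @ W == I, so the
# recurrent transition preserves the norm of the hidden state.
W, _ = np.linalg.qr(rng.standard_normal((hidden, hidden)))
V = rng.standard_normal((hidden, inp)) * 0.1

# Gate parameters: update gate z and reset gate r, as in a GRU.
Wz = rng.standard_normal((hidden, hidden)) * 0.1
Vz = rng.standard_normal((hidden, inp)) * 0.1
Wr = rng.standard_normal((hidden, hidden)) * 0.1
Vr = rng.standard_normal((hidden, inp)) * 0.1
b = np.zeros(hidden)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def goru_step(h, x):
    """One GORU-style step: GRU gating, but the candidate state is
    produced through the orthogonal transition matrix W."""
    z = sigmoid(Wz @ h + Vz @ x)           # update gate: how much old state to keep
    r = sigmoid(Wr @ h + Vr @ x)           # reset gate: how much old state feeds the candidate
    c = np.tanh(W @ (r * h) + V @ x + b)   # candidate state (tanh stands in for modReLU)
    return z * h + (1.0 - z) * c           # gated interpolation, as in a GRU

# Run a few steps on random inputs.
h = np.zeros(hidden)
for _ in range(5):
    h = goru_step(h, rng.standard_normal(inp))
print(h.shape)
```

The gates let the cell discard irrelevant content, while the orthogonal `W` keeps gradients through the recurrent path from vanishing or exploding, which is the combination the abstract describes.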

