Learning Simpler Language Models with the Differential State Framework

03/26/2017
by Alexander G. Ororbia II, et al.

Learning useful information across long time lags is a critical and difficult problem for temporal neural models in tasks such as language modeling. Existing architectures that address the issue are often complex and costly to train. The Differential State Framework (DSF) is a simple and high-performing design that unifies previously introduced gated neural models. DSF models maintain longer-term memory by learning to interpolate between a fast-changing, data-driven representation and a slowly changing, implicitly stable state. This requires hardly any more parameters than a classical, simple recurrent network. Within the DSF, a new architecture is presented, the Delta-RNN. In language modeling at the word and character levels, the Delta-RNN outperforms popular complex architectures, such as the Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU), and, when regularized, performs comparably to several state-of-the-art baselines. At the subword level, the Delta-RNN's performance is comparable to that of complex gated architectures.
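To make the interpolation idea concrete, below is a minimal NumPy sketch of a Delta-RNN-style cell. It is illustrative only: the class name, weight shapes, and the exact parameterization of the interpolation gate are assumptions for this example rather than the paper's precise equations. The point it demonstrates is that the new state is a learned, per-unit mixture of a fast, data-driven candidate and the slowly changing previous state, and that the gate adds only a handful of extra parameters beyond a simple recurrent network.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class DeltaRNNCell:
    """Illustrative Delta-RNN-style cell (simplified, not the paper's exact form):
    interpolates between a fast, data-driven candidate state and the slowly
    changing previous hidden state."""

    def __init__(self, input_size, hidden_size, rng=None):
        rng = rng or np.random.default_rng(0)
        scale = 1.0 / np.sqrt(hidden_size)
        # Same weight matrices a simple RNN would have
        self.W = rng.uniform(-scale, scale, (hidden_size, input_size))   # input weights
        self.U = rng.uniform(-scale, scale, (hidden_size, hidden_size))  # recurrent weights
        self.b = np.zeros(hidden_size)
        # The only extra parameters: a bias vector for the interpolation gate
        self.b_r = np.zeros(hidden_size)

    def step(self, x_t, h_prev):
        # Fast-changing, data-driven candidate representation
        z_t = np.tanh(self.W @ x_t + self.U @ h_prev + self.b)
        # Data-driven interpolation gate; here it reuses the input projection
        # and adds only a per-unit bias (an assumption made for this sketch)
        r_t = sigmoid(self.W @ x_t + self.b_r)
        # New state: learned per-unit mixture of candidate and stable previous state
        return (1.0 - r_t) * z_t + r_t * h_prev

# Toy usage: run the cell over a short random sequence
cell = DeltaRNNCell(input_size=8, hidden_size=16)
h = np.zeros(16)
for x in np.random.default_rng(1).normal(size=(5, 8)):
    h = cell.step(x, h)
```

When the gate saturates toward 1 for a unit, that unit simply copies its previous value, which is how this kind of cell can carry information across long time lags without the full machinery of an LSTM or GRU.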
