Reservoir Stack Machines

05/04/2021
by Benjamin Paaßen et al.

Memory-augmented neural networks equip a recurrent neural network with an explicit memory to support tasks that require information storage without interference over long time spans. A key motivation for such research is to perform classic computation tasks, such as parsing. However, memory-augmented neural networks are notoriously hard to train, requiring many backpropagation epochs and large amounts of data. In this paper, we introduce the reservoir stack machine, a model that provably recognizes all deterministic context-free languages and circumvents the training problem by training only the output layer of a recurrent net and by using auxiliary information during training about the desired interaction with a stack. In our experiments, we validate the reservoir stack machine against deep and shallow networks from the literature on three benchmark tasks for Neural Turing machines and on six deterministic context-free languages. Our results show that the reservoir stack machine achieves zero error, even on test sequences longer than the training data, requiring only a few seconds of training time and 100 training sequences.
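
The approach sketched in the abstract relies on reservoir computing: the recurrent weights stay fixed and random, and only linear readouts are trained, with the desired stack action at each time step provided as auxiliary supervision. The following Python/NumPy sketch illustrates that training scheme under stated assumptions; the class name, hyperparameters, and the toy bracket example are hypothetical, the stack contents and the output readout are omitted, and this is not the paper's reference implementation.

    # Minimal sketch (assumption, not the paper's code): an echo state reservoir
    # with fixed random weights drives a linear readout that predicts the stack
    # action (noop/push/pop). Only this readout is trained, via ridge regression
    # on teacher-provided action labels (the auxiliary information).
    import numpy as np

    rng = np.random.default_rng(0)

    class ReservoirStackSketch:
        def __init__(self, n_in, n_res=200, spectral_radius=0.9):
            # fixed random input and recurrent weights (the reservoir)
            self.W_in = rng.uniform(-1.0, 1.0, (n_res, n_in))
            W = rng.normal(0.0, 1.0, (n_res, n_res))
            self.W = W * (spectral_radius / np.max(np.abs(np.linalg.eigvals(W))))
            self.n_res = n_res
            self.V = None  # trained readout: reservoir state -> action scores

        def _states(self, X):
            # run the fixed reservoir over an input sequence X of shape (T, n_in)
            h = np.zeros(self.n_res)
            H = []
            for x in X:
                h = np.tanh(self.W_in @ x + self.W @ h)
                H.append(h.copy())
            return np.array(H)

        def fit(self, sequences, action_labels, ridge=1e-6):
            # auxiliary supervision: the desired stack action at every step
            # (0 = noop, 1 = push, 2 = pop) turns training into a closed-form
            # linear regression problem instead of backpropagation through time
            H = np.vstack([self._states(X) for X in sequences])
            Y = np.eye(3)[np.concatenate(action_labels)]   # one-hot targets
            A = H.T @ H + ridge * np.eye(self.n_res)
            self.V = np.linalg.solve(A, H.T @ Y)           # ridge regression

        def predict_actions(self, X):
            return np.argmax(self._states(X) @ self.V, axis=1)

    # toy usage: for the bracket string "(())", '(' should trigger a push and
    # ')' a pop; inputs are one-hot encoded bracket symbols
    seqs = [np.eye(2)[[0, 0, 1, 1]]]
    acts = [np.array([1, 1, 2, 2])]
    model = ReservoirStackSketch(n_in=2, n_res=100)
    model.fit(seqs, acts)
    print(model.predict_actions(seqs[0]))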

Related research

Context-Free Transductions with Neural Stacks (09/08/2018)
This paper analyzes the behavior of stack-augmented recurrent neural net...

Reservoir Memory Machines as Neural Computers (09/14/2020)
Differentiable neural computers extend artificial neural networks with a...

Memory-Augmented Recurrent Neural Networks Can Learn Generalized Dyck Languages (11/08/2019)
We introduce three memory-augmented Recurrent Neural Networks (MARNNs) a...

The Neural Network Pushdown Automaton: Model, Stack and Learning Simulations (11/15/2017)
In order for neural networks to learn complex languages or grammars, the...

The Surprising Computational Power of Nondeterministic Stack RNNs (10/04/2022)
Traditional recurrent neural networks (RNNs) have a fixed, finite number...

Complexity-calibrated Benchmarks for Machine Learning Reveal When Next-Generation Reservoir Computer Predictions Succeed and Mislead (03/25/2023)
Recurrent neural networks are used to forecast time series in finance, c...

Understanding Memory Modules on Learning Simple Algorithms (07/01/2019)
Recent work has shown that memory modules are crucial for the generaliza...
