A theory of sequence indexing and working memory in recurrent neural networks

02/28/2018
by E. Paxon Frady, et al.

To accommodate structured approaches to neural computation, we propose a class of recurrent neural networks for indexing and storing sequences of symbols or analog data vectors. These networks, with randomized input weights and orthogonal recurrent weights, implement coding principles previously described in vector symbolic architectures (VSA) and leverage properties of reservoir computing. In general, storage in reservoir computing is lossy, and crosstalk noise limits retrieval accuracy and information capacity. We present a novel theory for optimizing memory performance in such networks and compare it with simulation experiments. The theory covers the linear readout of analog data as well as readout with winner-take-all error correction of symbolic data, as proposed in VSA models. We find that diverse VSA models from the literature have universal performance properties, which are superior to what previous analyses predicted. Further, we propose novel VSA models with the statistically optimal Wiener filter in the readout, which exhibit much higher information capacity, in particular for storing analog data. The theory also applies to memory buffers, that is, networks with gradual forgetting, which can operate on infinite data streams without memory overflow. Interestingly, different forgetting mechanisms, such as attenuating recurrent weights or neural nonlinearities, produce very similar behavior when their forgetting time constants are aligned. Such models exhibit extensive capacity when the forgetting time constant is optimized for the given noise conditions and network size. These results enable the design of new types of VSA models for the online processing of data streams.
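As a concrete illustration (a minimal sketch, not code from the paper), the Python snippet below implements the kind of encode/decode scheme the abstract describes: a random permutation serves as the orthogonal recurrent matrix, random bipolar codebook vectors play the role of the randomized input weights, and a winner-take-all readout matches the unwound state against the codebook. All names and parameter choices (N, D, T, gamma) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

N = 1024   # number of neurons (vector dimension)
D = 27     # alphabet size (number of symbols)
T = 15     # sequence length

# Random bipolar codebook, one vector per symbol; these stand in for
# the randomized input weights described in the abstract.
codebook = rng.choice([-1.0, 1.0], size=(D, N))

# Orthogonal recurrent weights, chosen here as a random permutation:
# permutation matrices are orthogonal and trivially invertible.
perm = rng.permutation(N)
inv_perm = np.argsort(perm)

def encode(seq, gamma=1.0):
    """Superpose a symbol sequence into a single N-dim network state.

    Each step applies the recurrent permutation and adds the next
    symbol's code vector: x <- gamma * W x + c_s. With gamma = 1 this
    is plain sequence indexing; gamma < 1 attenuates the recurrent
    weights, giving a memory buffer with gradual forgetting."""
    x = np.zeros(N)
    for s in seq:
        x = gamma * x[perm] + codebook[s]
    return x

def decode_wta(x, T):
    """Winner-take-all readout: unwind one recurrent step at a time
    and pick the best-matching codebook entry. Crosstalk from the
    other stored items acts as retrieval noise."""
    decoded = []
    y = x.copy()
    for _ in range(T):
        decoded.append(int(np.argmax(codebook @ y)))
        y = y[inv_perm]            # undo one recurrent step
    return decoded[::-1]           # items were decoded last-stored-first

seq = rng.integers(0, D, size=T)
x = encode(seq, gamma=0.95)        # mild forgetting
print("stored:   ", seq.tolist())
print("retrieved:", decode_wta(x, T))
```

With gamma = 1 the network indexes a fixed-length sequence; gamma < 1 yields the attenuated-recurrent-weights buffer from the abstract, trading recall of older items for a bounded state norm on an unbounded stream. The statistically optimal Wiener-filter readout discussed in the abstract would replace the plain inner-product matching with a linear estimator tuned to the crosstalk statistics; it is not shown in this sketch.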
