Continual Learning with Echo State Networks

by   Andrea Cossu, et al.

Continual Learning (CL) refers to a learning setup in which data are non-stationary and the model must acquire new knowledge without forgetting what it has already learned. The study of CL for sequential patterns has so far revolved around fully trained recurrent networks. In this work, we instead introduce CL in the context of Echo State Networks (ESNs), where the recurrent component is kept fixed. We provide the first evaluation of catastrophic forgetting in ESNs and highlight the benefits of using CL strategies that are not applicable to trained recurrent models. Our results confirm the ESN as a promising model for CL and open the way to its use in streaming scenarios.
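To make the key architectural point concrete, here is a minimal sketch of a standard leaky-integrator ESN: the recurrent reservoir is randomly initialized and never trained, and only a linear readout is fit (here by ridge regression). All sizes, the leak rate, and the regularization constant are illustrative assumptions, not values from the paper; the sketch is not the authors' implementation, only the textbook ESN formulation that makes readout-only CL strategies applicable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper).
n_in, n_res, n_out = 3, 100, 2

# Fixed random reservoir: these weights are never updated, so any
# continual-learning strategy only has to manage the linear readout.
W_in = rng.uniform(-0.1, 0.1, size=(n_res, n_in))
W = rng.uniform(-1.0, 1.0, size=(n_res, n_res))
# Rescale to spectral radius < 1, a common sufficient condition
# for the echo state property.
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def reservoir_states(inputs, leak=0.3):
    """Run the fixed reservoir over a sequence; no gradients are needed."""
    h = np.zeros(n_res)
    states = []
    for u in inputs:
        h = (1 - leak) * h + leak * np.tanh(W_in @ u + W @ h)
        states.append(h.copy())
    return np.array(states)

def fit_readout(states, targets, reg=1e-6):
    """Closed-form ridge regression for the readout weights."""
    return np.linalg.solve(states.T @ states + reg * np.eye(n_res),
                           states.T @ targets)

# Toy usage on random data, just to show the training flow.
X = rng.normal(size=(50, n_in))
Y = rng.normal(size=(50, n_out))
S = reservoir_states(X)
W_out = fit_readout(S, Y)
pred = S @ W_out
```

Because training reduces to a linear problem over reservoir states, replay buffers or per-task readouts can be applied cheaply, whereas backpropagation-through-time over the recurrent weights would be required in a fully trained RNN.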

