Automatic Recall Machines: Internal Replay, Continual Learning and the Brain

06/22/2020
by Xu Ji et al.

Replay in neural networks involves training on sequential data with memorized samples, which counteracts forgetting of previous behavior caused by non-stationarity. We present a method where these auxiliary samples are generated on the fly, given only the model that is being trained for the assessed objective, without extraneous buffers or generator networks. Instead the implicit memory of learned samples within the assessed model itself is exploited. Furthermore, whereas existing work focuses on reinforcing the full seen data distribution, we show that optimizing for not forgetting calls for the generation of samples that are specialized to each real training batch, which is more efficient and scalable. We consider high-level parallels with the brain, notably the use of a single model for inference and recall, the dependency of recalled samples on the current environment batch, top-down modulation of activations and learning, abstract recall, and the dependency between the degree to which a task is learned and the degree to which it is recalled. These characteristics emerge naturally from the method without being controlled for.
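The abstract describes generating auxiliary replay samples from the trained model itself, specialized to each real training batch. A minimal sketch of that idea, under loud assumptions: a tiny linear softmax classifier stands in for the model, and "recall" is implemented here as gradient ascent on an input to maximize the KL divergence between the model's predictions before and after a simulated update on the current batch (a sample the update would change most is one worth replaying). The model, objective details, and hyperparameters below are illustrative, not the paper's exact method.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def kl(p, q):
    # KL divergence between two categorical distributions
    return np.sum(p * (np.log(p) - np.log(q)), axis=-1)

rng = np.random.default_rng(0)
d, k = 5, 3                      # toy input dim and class count
W_old = rng.normal(size=(d, k))  # current model parameters (logits = x @ W)

# simulate one SGD step of cross-entropy on the current real batch
x_batch = rng.normal(size=(8, d))
y_batch = rng.integers(0, k, size=8)
p_batch = softmax(x_batch @ W_old)
grad_W = x_batch.T @ (p_batch - np.eye(k)[y_batch]) / len(x_batch)
W_new = W_old - 0.5 * grad_W     # hypothetical post-update parameters

def recall_sample(x0, steps=200, lr=0.2, eps=1e-4):
    """Ascend on the input to maximize pre/post-update prediction divergence.

    Uses a numeric gradient for brevity; a real implementation would
    backpropagate through the model instead.
    """
    x = x0.copy()
    for _ in range(steps):
        base = kl(softmax(x @ W_old), softmax(x @ W_new))
        g = np.zeros_like(x)
        for i in range(d):
            xp = x.copy()
            xp[i] += eps
            g[i] = (kl(softmax(xp @ W_old), softmax(xp @ W_new)) - base) / eps
        x += lr * g
        x = np.clip(x, -3.0, 3.0)  # keep the recalled input in a bounded domain
    return x

x0 = rng.normal(size=d)
x_rec = recall_sample(x0)
```

The recalled input `x_rec` is, by construction, a point where the batch update shifts the model's predictions most, so training on it alongside the real batch penalizes exactly the change most likely to cause forgetting — which is why the generated samples depend on each real batch rather than on the full seen distribution.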


