AFEC: Active Forgetting of Negative Transfer in Continual Learning

10/23/2021
by Liyuan Wang, et al.

Continual learning aims to learn a sequence of tasks from dynamic data distributions. Without access to the old training samples, knowledge transfer from the old tasks to each new task is difficult to determine and may be either positive or negative. If the old knowledge interferes with the learning of a new task, i.e., the forward knowledge transfer is negative, then precisely remembering the old tasks will further aggravate the interference and degrade the performance of continual learning. By contrast, biological neural networks can actively forget old knowledge that conflicts with the learning of a new experience, by regulating learning-triggered synaptic expansion and synaptic convergence. Inspired by biological active forgetting, we propose to actively forget the old knowledge that limits the learning of new tasks, to the benefit of continual learning. Under the framework of Bayesian continual learning, we develop a novel approach named Active Forgetting with synaptic Expansion-Convergence (AFEC). Our method dynamically expands parameters to learn each new task and then selectively combines them, which is formally consistent with the underlying mechanism of biological active forgetting. We extensively evaluate AFEC on a variety of continual learning benchmarks, including CIFAR-10 regression tasks, visual classification tasks and Atari reinforcement tasks, where AFEC effectively improves the learning of new tasks and achieves state-of-the-art performance in a plug-and-play way.
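To make the expansion-convergence idea more concrete, the sketch below shows one possible reading of the abstract: a new-task loss combined with two Fisher-weighted quadratic penalties on the shared parameters, one pulling toward the consolidated old-task solution (remembering) and one pulling toward the parameters of a temporarily expanded copy trained on the new task alone (actively forgetting conflicting old knowledge). All names (afec_style_loss, lambda_old, lambda_new, fisher_old, fisher_e) and the exact form of the penalties are illustrative assumptions, not the authors' released implementation.

```python
import numpy as np

def afec_style_loss(theta, new_task_loss,
                    theta_old, fisher_old,   # consolidated parameters / importance from old tasks
                    theta_e, fisher_e,       # parameters / importance of an expanded copy trained on the new task only
                    lambda_old=1.0, lambda_new=1.0):
    """Hypothetical sketch: new-task loss plus two Fisher-weighted quadratic penalties.

    The first penalty preserves old knowledge, as in standard Bayesian
    weight-regularization continual learning; the second pulls the shared
    parameters toward a new-task-only solution, acting as active forgetting.
    """
    remember = 0.5 * lambda_old * np.sum(fisher_old * (theta - theta_old) ** 2)
    forget = 0.5 * lambda_new * np.sum(fisher_e * (theta - theta_e) ** 2)
    return new_task_loss(theta) + remember + forget

# Minimal toy usage (purely illustrative vectors):
theta = np.array([0.5, -0.2, 1.0])
theta_old = np.zeros(3)
theta_e = np.array([1.0, 0.0, 1.5])
fisher_old = np.array([2.0, 0.1, 0.5])
fisher_e = np.array([0.3, 1.0, 0.8])
toy_new_task_loss = lambda th: np.sum((th - theta_e) ** 2)  # stands in for the real task loss
print(afec_style_loss(theta, toy_new_task_loss, theta_old, fisher_old, theta_e, fisher_e))
```

In this reading, the relative weights lambda_old and lambda_new control the trade-off between remembering old tasks and following the new-task solution when the two conflict.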


Related research

Continual Learning Through Synaptic Intelligence (03/13/2017)
While deep learning has led to remarkable advances across diverse applic...

Learning to Remember: A Synaptic Plasticity Driven Framework for Continual Learning (04/05/2019)
Models trained in the context of continual learning (CL) should be able ...

Active Continual Learning: Labelling Queries in a Sequence of Tasks (05/06/2023)
Acquiring new knowledge without forgetting what has been learned in a se...

RECALL: Rehearsal-free Continual Learning for Object Classification (09/29/2022)
Convolutional neural networks show remarkable results in classification ...

Continual Sequence Generation with Adaptive Compositional Modules (03/20/2022)
Continual learning is essential for real-world deployment when there is ...

Learning Curves for Sequential Training of Neural Networks: Self-Knowledge Transfer and Forgetting (12/03/2021)
Sequential training from task to task is becoming one of the major objec...

Progress & Compress: A scalable framework for continual learning (05/16/2018)
We introduce a conceptually simple and scalable framework for continual ...
