Continual learning under domain transfer with sparse synaptic bursting

08/26/2021
by Shawn L. Beaulieu, et al.

Existing machines are functionally specific tools that were made for easy prediction and control. Tomorrow's machines may be closer to biological systems in their mutability, resilience, and autonomy. But first they must be capable of learning, and retaining, new information without repeated exposure to it. Past efforts to engineer such systems have sought to build or regulate artificial neural networks using task-specific modules with constrained circumstances of application. This has not yet enabled continual learning over long sequences of previously unseen data without corrupting existing knowledge: a problem known as catastrophic forgetting. In this paper, we introduce a system that can learn sequentially over previously unseen datasets (ImageNet, CIFAR-100) with little forgetting over time. This is accomplished by regulating the activity of weights in a convolutional neural network on the basis of inputs using top-down modulation generated by a second feed-forward neural network. We find that our method learns continually under domain transfer with sparse bursts of activity in weights that are recycled across tasks, rather than by maintaining task-specific modules. Sparse synaptic bursting is found to balance enhanced and diminished activity in a way that facilitates adaptation to new inputs without corrupting previously acquired functions. This behavior emerges during a prior meta-learning phase in which regulated synapses are selectively disinhibited, or grown, from an initial state of uniform suppression.
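To make the mechanism concrete, below is a minimal, hypothetical sketch of input-conditioned synaptic gating: a second feed-forward (modulatory) network maps each input to a per-synapse gate that scales the weights of a prediction layer before they are applied, and the gates start from near-uniform suppression so that synapses must be selectively disinhibited. The class name ModulatedNet, the fully connected toy layer (in place of the paper's convolutional network), the layer sizes, the sigmoid gating, and the negative bias initialization are illustrative assumptions, not the authors' implementation.

    import torch
    import torch.nn as nn

    class ModulatedNet(nn.Module):
        """Toy sketch of input-conditioned synaptic gating (not the paper's code).

        A modulatory feed-forward network maps each input to a per-synapse gate
        that scales the weights of a prediction layer before they are applied.
        """

        def __init__(self, in_dim=128, hidden_dim=64, n_classes=100):
            super().__init__()
            # Prediction pathway whose synapses are regulated.
            self.pred_fc = nn.Linear(in_dim, hidden_dim)
            self.head = nn.Linear(hidden_dim, n_classes)
            # Modulatory pathway: same input -> one gate per synapse of pred_fc.
            self.mod = nn.Sequential(
                nn.Linear(in_dim, hidden_dim * in_dim),
                nn.Sigmoid(),  # gates in [0, 1]: 0 suppresses, 1 disinhibits
            )
            # Start from near-uniform suppression: a strongly negative bias
            # pushes the sigmoid gates toward 0, so synapses must be selectively
            # disinhibited ("grown") during a prior meta-learning phase.
            nn.init.constant_(self.mod[0].bias, -5.0)

        def forward(self, x):
            # x: (batch, in_dim). Gate shaped like pred_fc.weight, per example.
            gate = self.mod(x).view(x.shape[0], self.pred_fc.out_features, x.shape[-1])
            # Scale the weights (not the activations) so modulation acts on
            # synapses; sparse gates yield sparse bursts of weight activity.
            w = self.pred_fc.weight.unsqueeze(0) * gate  # (B, hidden, in_dim)
            h = torch.relu(torch.bmm(w, x.unsqueeze(-1)).squeeze(-1) + self.pred_fc.bias)
            return self.head(h)

    net = ModulatedNet()
    logits = net(torch.randn(8, 128))  # (8, 100) class scores

The meta-learning phase described in the abstract would train the modulatory parameters over sequences of tasks so that useful, sparse gates emerge; that outer loop is omitted from this sketch for brevity.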


