Lifelong Learning with Sketched Structural Regularization

04/17/2021
by Haoran Li, et al.

Preventing catastrophic forgetting while continually learning new tasks is an essential problem in lifelong learning. Structural regularization (SR) refers to a family of algorithms that mitigate catastrophic forgetting by penalizing the network for changing its "critical parameters" from previous tasks while learning a new one. The penalty is often induced via a quadratic regularizer defined by an importance matrix, e.g., the (empirical) Fisher information matrix in the Elastic Weight Consolidation framework. In practice, due to computational constraints, most SR methods crudely approximate the importance matrix by its diagonal. In this paper, we propose Sketched Structural Regularization (Sketched SR) as an alternative approach to compressing the importance matrices used for regularization in SR methods. Specifically, we apply linear sketching methods to better approximate the importance matrices in SR algorithms. We show that Sketched SR: (i) is computationally efficient and straightforward to implement, (ii) provides an approximation error that is theoretically justified, and (iii) is method-oblivious by construction and can be adapted to any method in the structural regularization class. We show that our proposed approach consistently improves various SR algorithms' performance on both synthetic experiments and benchmark continual learning tasks, including permuted-MNIST and CIFAR-100.
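The idea in the abstract can be illustrated numerically. Below is a minimal NumPy sketch (not the paper's implementation) of the three ways to evaluate an SR penalty on a toy problem: the exact quadratic penalty under the empirical Fisher, the usual diagonal approximation, and a linearly sketched approximation. It assumes the empirical Fisher is formed from per-sample gradients `G` at the old task's optimum, and uses a Gaussian sketch as one concrete instance of a linear sketch; the variable names and problem sizes are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy per-sample gradients at the previous task's optimum: G is (n_samples, n_params).
# The empirical Fisher is F = G.T @ G / n, and the SR penalty for a parameter
# shift delta = theta - theta_star is delta.T @ F @ delta.
n, d, k = 500, 20, 200
A = rng.normal(size=(d, d))              # mixing matrix so F has off-diagonal mass
G = rng.normal(size=(n, d)) @ A
delta = rng.normal(size=d)

exact = delta @ (G.T @ G / n) @ delta

# Diagonal approximation (what EWC-style methods use in practice):
# keep only F_ii, discarding all parameter interactions.
diag = np.sum((G ** 2).mean(axis=0) * delta ** 2)

# Linear sketch: compress the n gradient rows with a Gaussian sketch S (k x n),
# so F is approximated by (S G).T @ (S G) / n and the penalty by ||S G delta||^2 / n.
# Since E[S.T @ S] = I, the sketched penalty is unbiased, with variance
# shrinking as the sketch size k grows.
S = rng.normal(size=(k, n)) / np.sqrt(k)
sketched = np.sum((S @ (G @ delta)) ** 2) / n

print(f"exact={exact:.3f}  diagonal={diag:.3f}  sketched={sketched:.3f}")
```

With a modest sketch size `k`, the sketched penalty tracks the exact quadratic form closely while never materializing the d-by-d importance matrix, whereas the diagonal approximation ignores cross-parameter terms entirely.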


