Structured Compression and Sharing of Representational Space for Continual Learning

01/23/2020
by Gobinda Saha, et al.

Humans are skilled at learning adaptively and efficiently throughout their lives, but learning tasks incrementally causes artificial neural networks to overwrite relevant information learned about older tasks, resulting in 'Catastrophic Forgetting'. Efforts to overcome this phenomenon suffer from poor resource utilization in many ways, such as the need to store older data or parametric importance scores, or to grow the network architecture. We propose an algorithm that enables a network to learn continually and efficiently by partitioning the representational space into a Core space, which contains the condensed information from previously learned tasks, and a Residual space, which is akin to a scratch space for learning the current task. The information in the Residual space is then compressed using Principal Component Analysis and added to the Core space, freeing up parameters for the next task. We evaluate our algorithm on the P-MNIST, CIFAR-10 and CIFAR-100 datasets. We achieve accuracy comparable to state-of-the-art methods while completely overcoming the problem of catastrophic forgetting. Additionally, we obtain up to a 4.5x improvement in energy efficiency during inference due to the structured nature of the resulting architecture.
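The PCA-based compression step described above lends itself to a short sketch. The code below is a minimal illustration, not the authors' implementation: it assumes layer activations are collected after training each task, projects out the part already spanned by the Core basis, extracts the dominant principal directions of the residual, and merges them into the Core space. The function name update_core_space and the threshold parameter are illustrative assumptions.

import numpy as np

def update_core_space(core_basis, activations, threshold=0.95):
    """Sketch: compress residual-space activations with PCA and merge into the Core basis.

    core_basis  : (d, k) matrix with orthonormal columns spanning the Core space,
                  or None before the first task.
    activations : (n, d) matrix of layer activations gathered on the current task.
    threshold   : fraction of variance to retain from the residual representation
                  (illustrative hyperparameter).
    """
    X = activations - activations.mean(axis=0)           # center for PCA
    if core_basis is not None:
        # Remove the component already explained by the Core space,
        # leaving only the Residual-space information.
        X = X - (X @ core_basis) @ core_basis.T
    # PCA via SVD of the (residual) activations.
    _, S, Vt = np.linalg.svd(X, full_matrices=False)
    var = S ** 2
    k = np.searchsorted(np.cumsum(var) / var.sum(), threshold) + 1
    new_dirs = Vt[:k].T                                   # (d, k) dominant directions
    if core_basis is None:
        return new_dirs
    # Append the new directions and re-orthonormalize the enlarged Core basis.
    merged, _ = np.linalg.qr(np.concatenate([core_basis, new_dirs], axis=1))
    return merged

In use, the returned basis would constrain or freeze updates along Core directions for subsequent tasks, so that new learning happens only in the remaining Residual space.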

