Continually Learning Self-Supervised Representations with Projected Functional Regularization

12/30/2021
by Alex Gomez-Villa, et al.

Recent self-supervised learning methods can learn high-quality image representations and are closing the gap with supervised methods. However, these methods cannot acquire new knowledge incrementally; in practice they are mostly used as a pre-training phase on i.i.d. data. In this work we investigate self-supervised methods in continual learning regimes without additional memory or replay. To prevent forgetting of previous knowledge, we propose the use of functional regularization. We show that naive functional regularization, also known as feature distillation, leads to low plasticity and therefore severely limits continual learning performance. To address this problem, we propose Projected Functional Regularization, in which a separate projection network ensures that the newly learned feature space preserves the information of the previous feature space while still allowing new features to be learned. This prevents forgetting while maintaining the plasticity of the learner. Evaluation against other incremental learning approaches applied to self-supervision demonstrates that our method obtains competitive performance in different scenarios and on multiple datasets.
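To make the mechanism concrete, below is a minimal PyTorch sketch of the idea as the abstract describes it: a frozen copy of the previous encoder provides target features, and a small learnable projection network maps the current encoder's features onto the old feature space, so the regularizer constrains recoverability of old information rather than the features themselves. The class name, projector architecture, and defaults (PFR, feat_dim, proj_hidden, lam, start_new_task) are illustrative assumptions, not the authors' released code.

```python
# Hypothetical sketch of Projected Functional Regularization (PFR);
# names and hyperparameters are assumptions, not the paper's implementation.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class PFR(nn.Module):
    def __init__(self, backbone, feat_dim=2048, proj_hidden=512, lam=1.0):
        super().__init__()
        self.backbone = backbone          # current (plastic) encoder f_t
        self.old_backbone = None          # frozen snapshot f_{t-1} of the previous task
        self.lam = lam                    # regularization strength (assumed default)
        # Learnable projector g: maps the new feature space onto the old one,
        # so f_t may drift as long as the old features remain recoverable.
        self.projector = nn.Sequential(
            nn.Linear(feat_dim, proj_hidden),
            nn.BatchNorm1d(proj_hidden),
            nn.ReLU(inplace=True),
            nn.Linear(proj_hidden, feat_dim),
        )

    def start_new_task(self):
        # Freeze a copy of the current encoder before training on the next task.
        self.old_backbone = copy.deepcopy(self.backbone)
        for p in self.old_backbone.parameters():
            p.requires_grad = False

    def regularization(self, x):
        # No regularization on the first task (nothing to preserve yet).
        if self.old_backbone is None:
            return x.new_zeros(())
        z_new = self.backbone(x)          # features of the current encoder
        with torch.no_grad():
            z_old = self.old_backbone(x)  # target features of the frozen encoder
        # Negative cosine similarity between projected new features and old
        # features: the projector absorbs feature-space drift, keeping plasticity.
        return -F.cosine_similarity(self.projector(z_new), z_old, dim=-1).mean()
```

In training, the regularizer would simply be added to whatever self-supervised objective is in use, e.g. loss = ssl_loss + model.lam * model.regularization(x); calling start_new_task() at each task boundary refreshes the frozen target encoder.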

Related research:

03/25/2021 · Self-Supervised Training Enhances Online Continual Learning
In continual learning, a system must incrementally learn from a non-stat...

07/11/2022 · Consistency is the key to further mitigating catastrophic forgetting in continual learning
Deep neural networks struggle to continually learn multiple sequential t...

06/10/2020 · Self-Supervised Learning Aided Class-Incremental Lifelong Learning
Lifelong or continual learning remains to be a challenge for artificial ...

09/12/2023 · Plasticity-Optimized Complementary Networks for Unsupervised Continual Learning
Continuous unsupervised representation learning (CURL) research has grea...

07/26/2021 · Continual-wav2vec2: an Application of Continual Learning for Self-Supervised Automatic Speech Recognition
We present a method for continual learning of speech representations for...

03/16/2023 · CSSL-MHTR: Continual Self-Supervised Learning for Scalable Multi-script Handwritten Text Recognition
Self-supervised learning has recently emerged as a strong alternative in...

10/27/2018 · Self-Supervised GAN to Counter Forgetting
GANs involve training two networks in an adversarial game, where each ne...
