Complementary Calibration: Boosting General Continual Learning with Collaborative Distillation and Self-Supervision

09/03/2021
by Zhong Ji, et al.

General Continual Learning (GCL) aims at learning from non-independent and identically distributed (non-i.i.d.) stream data without catastrophic forgetting of old tasks, and it does not rely on task boundaries during either the training or the testing stage. We reveal that relation deviation and feature deviation are crucial problems behind catastrophic forgetting: relation deviation refers to the deficient relationship among all classes in knowledge distillation, and feature deviation refers to indiscriminative feature representations. To this end, we propose a Complementary Calibration (CoCa) framework that mines the complementary outputs and features of the model to alleviate these two deviations during GCL. Specifically, we propose a new collaborative distillation approach to address relation deviation: it distills the model's outputs by utilizing the ensemble dark knowledge of the new model's outputs and the reserved outputs, which maintains performance on old tasks while balancing the relationship among all classes. Furthermore, we explore a collaborative self-supervision idea that leverages pretext tasks and supervised contrastive learning to address feature deviation by learning complete and discriminative features for all classes. Extensive experiments on four popular datasets show that our CoCa framework achieves superior performance against state-of-the-art methods.
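To make the two mechanisms concrete, here is a minimal, hypothetical PyTorch sketch of the loss ideas the abstract describes, not the paper's actual implementation: a temperature-scaled distillation loss whose teacher is an ensemble of the new model's outputs and outputs reserved from earlier in the stream, and a supervised contrastive loss (in the style of Khosla et al., 2020) for discriminative features. The function names, the equal-weight ensemble, and the hyperparameters (tau, temperature) are all assumptions; the pretext-task branch (e.g., rotation prediction) is omitted for brevity.

```python
# Hypothetical sketch of the two CoCa loss ideas; the ensemble weighting,
# temperatures, and names are assumptions, not the authors' code.
import torch
import torch.nn.functional as F

def collaborative_distillation_loss(new_logits, reserved_logits, tau=2.0):
    """Distill toward an ensemble 'dark knowledge' teacher built from the
    new model's (detached) outputs and the reserved outputs."""
    teacher = 0.5 * (F.softmax(new_logits.detach() / tau, dim=1)
                     + F.softmax(reserved_logits / tau, dim=1))
    student = F.log_softmax(new_logits / tau, dim=1)
    # Temperature-scaled KL divergence, as in standard knowledge distillation.
    return F.kl_div(student, teacher, reduction="batchmean") * tau ** 2

def supervised_contrastive_loss(features, labels, temperature=0.1):
    """Supervised contrastive loss: pull same-class features together and
    push different-class features apart (positives share a label)."""
    z = F.normalize(features, dim=1)
    n = z.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=z.device)
    sim = (z @ z.T / temperature).masked_fill(eye, -1e9)  # exclude self
    pos = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    n_pos = pos.sum(1)
    has_pos = n_pos > 0  # anchors with no positive in the batch are skipped
    loss = -(log_prob * pos).sum(1)[has_pos] / n_pos[has_pos]
    return loss.mean()
```

Presumably these terms would be added, with suitable weights, to the classification loss on each mixed batch of stream and buffered samples; the exact combination is not specified in the abstract.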

Related research

03/26/2023
Preserving Linear Separability in Continual Learning by Backward Feature Projection
Catastrophic forgetting has been a major challenge in continual learning...

08/11/2021
Discriminative Distillation to Reduce Class Confusion in Continual Learning
Successful continual learning of new knowledge would enable intelligent ...

08/01/2023
Online Prototype Learning for Online Continual Learning
Online continual learning (CL) studies the problem of learning continuou...

07/22/2023
Revisiting Distillation for Continual Learning on Visual Question Localized-Answering in Robotic Surgery
The visual-question localized-answering (VQLA) system can serve as a kno...

07/24/2022
Online Continual Learning with Contrastive Vision Transformer
Online continual learning (online CL) studies the problem of learning se...

10/24/2019
Adversarial Feature Alignment: Avoid Catastrophic Forgetting in Incremental Task Lifelong Learning
Human beings are able to master a variety of knowledge and skills with o...

03/15/2022
SATS: Self-Attention Transfer for Continual Semantic Segmentation
Continually learning to segment more and more types of image regions is ...
