DILF-EN framework for Class-Incremental Learning

by Mohammed Asad Karim, et al.

Deep learning models trained in the class-incremental learning setting suffer from catastrophic forgetting: as they are trained on the classes introduced in a new phase, they forget the classes from older phases. In this work, we present a novel finding: the effect of catastrophic forgetting on a model's predictions varies with the orientation of the same image. Based on this, we propose a novel data-ensemble approach that combines the predictions for different orientations of an image, helping the model retain more information about the previously seen classes and thereby reducing the effect of forgetting on its predictions. However, the data-ensemble approach cannot be applied directly to a model trained with traditional techniques. We therefore also propose a novel dual-incremental learning framework that jointly trains the network with two incremental learning objectives: the class-incremental learning objective and our proposed data-incremental learning objective. In this framework, each image belongs to two classes: the image class (for class-incremental learning) and the orientation class (for data-incremental learning). In class-incremental learning, each new phase introduces a new set of classes, and the model cannot access the complete training data from the older phases. In our proposed data-incremental learning, the orientation classes remain the same across all phases, and the data introduced by each new phase of class-incremental learning serves as new training data for these orientation classes. We empirically demonstrate that the dual-incremental learning framework is vital to the data-ensemble approach. We apply our approach to state-of-the-art class-incremental learning methods and empirically show that it significantly improves their performance.
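The two ideas above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes the orientations are the four 90-degree rotations and a single balancing weight `lam` between the two objectives; `model`, `data_ensemble_predict`, and `dual_loss` are hypothetical names introduced here for clarity.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def data_ensemble_predict(model, image, num_orientations=4):
    """Data-ensemble sketch: average the class probabilities the model
    assigns to rotated copies of the same image.

    `model` maps an HxW array to unnormalised class logits; the rotations
    used here (multiples of 90 degrees) are an assumption of this sketch.
    """
    probs = [softmax(model(np.rot90(image, k))) for k in range(num_orientations)]
    return np.mean(probs, axis=0)

def dual_loss(class_logits, orient_logits, class_label, orient_label, lam=1.0):
    """Dual-incremental objective sketch: cross-entropy on the image class
    (class-incremental) plus cross-entropy on the orientation class
    (data-incremental), combined with a hypothetical weight `lam`."""
    ce = lambda logits, y: -np.log(softmax(logits)[y])
    return ce(class_logits, class_label) + lam * ce(orient_logits, orient_label)
```

Because the network is also trained to predict the orientation class, its predictions for rotated inputs remain informative, which is what makes the averaged ensemble prediction meaningful.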


