EEC: Learning to Encode and Regenerate Images for Continual Learning

by Ali Ayub, et al.

The two main impediments to continual learning are catastrophic forgetting and memory limitations on the storage of data. To cope with these challenges, we propose a novel, cognitively-inspired approach which trains autoencoders with Neural Style Transfer to encode and store images. During training on a new task, reconstructed images from encoded episodes are replayed in order to avoid catastrophic forgetting. The loss function for the reconstructed images is weighted to reduce its effect during classifier training, coping with image degradation. When the system runs out of memory, the encoded episodes are converted into centroids and covariance matrices, which are used to generate pseudo-images during classifier training, keeping classifier performance stable while using less memory. Our approach increases classification accuracy by 13-17% over state-of-the-art methods on benchmark datasets, while requiring 78% less storage space.
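The memory-compression step described above can be sketched in a few lines: once raw encodings are discarded, each class is summarized by a centroid and covariance matrix, and pseudo-encodings are drawn from the resulting Gaussian during classifier training, with a weighting factor down-weighting the replayed loss. This is a minimal illustration, not the authors' implementation; the function names, the Gaussian sampling shortcut, and the `gamma` weighting parameter are assumptions for the sketch.

```python
import numpy as np

def summarize_episodes(encodings):
    """Compress a class's stored encodings (n_samples x dim) into a
    centroid and covariance matrix, freeing the per-sample memory."""
    centroid = encodings.mean(axis=0)
    cov = np.cov(encodings, rowvar=False)
    return centroid, cov

def generate_pseudo_encodings(centroid, cov, n_samples, seed=None):
    """Sample pseudo-encodings from the Gaussian summary; these would be
    passed through the decoder to produce pseudo-images for replay."""
    rng = np.random.default_rng(seed)
    return rng.multivariate_normal(centroid, cov, size=n_samples)

def replay_weighted_loss(loss_new, loss_replay, gamma=0.5):
    """Combine the loss on new-task data with a down-weighted loss on
    reconstructed/pseudo images (gamma < 1 reduces the effect of
    degraded replayed images on classifier training)."""
    return loss_new + gamma * loss_replay
```

For example, summarizing 1000 stored 64-dimensional encodings replaces a 1000x64 array with one 64-vector and one 64x64 matrix, from which any number of pseudo-encodings can later be sampled.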




