Looking through the past: better knowledge retention for generative replay in continual learning

09/18/2023
by Valeriya Khan, et al.

In this work, we improve generative replay in the continual learning setting so that it performs well in challenging scenarios. Current generative rehearsal methods are usually benchmarked on small and simple datasets, as they are not powerful enough to generate more complex data with a larger number of classes. We observe that in VAE-based generative replay, this weakness can be attributed to the generated features lying far from the original ones when mapped into the latent space. We therefore propose three modifications that allow the model to learn and generate complex data. Specifically, we incorporate distillation in the latent space between the current and previous models to reduce feature drift. Additionally, we propose latent matching between reconstructions and the original data to improve the alignment of generated features. Finally, based on the observation that reconstructions preserve knowledge better than raw generations, we cycle the generations through the previously trained model to bring them closer to the original data. Our method outperforms other generative replay methods in various scenarios. Code is available at https://github.com/valeriya-khan/looking-through-the-past.
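The three modifications can be read as auxiliary loss terms on a standard VAE trained with generative replay. The following is a minimal sketch in PyTorch under assumed architectures and hypothetical names (the toy `VAE` class, `replay_losses`, the MSE-based distillation); it is not the authors' implementation, which is available at the repository linked above.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    """Toy VAE: encoder yields a latent mean/log-variance, decoder reconstructs."""
    def __init__(self, in_dim=784, z_dim=32):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.mu = nn.Linear(256, z_dim)
        self.logvar = nn.Linear(256, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(),
                                 nn.Linear(256, in_dim), nn.Sigmoid())

    def encode(self, x):
        h = self.enc(x)
        return self.mu(h), self.logvar(h)

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization
        return self.dec(z), mu, logvar

def replay_losses(model, prev_model, x_new, z_replay):
    """Sketch of the three modifications as extra loss terms (hypothetical)."""
    with torch.no_grad():
        # (3) Cycling: decode replayed latents with the frozen previous model,
        # then pass the generations through it once more; the resulting
        # reconstructions are closer to the original data than raw generations.
        x_gen = prev_model.dec(z_replay)
        x_replay, _, _ = prev_model(x_gen)
        mu_prev, _ = prev_model.encode(x_replay)

    # (1) Latent distillation: the current encoder should map replayed samples
    # near where the previous encoder mapped them, reducing feature drift.
    mu_cur, _ = model.encode(x_replay)
    loss_distill = F.mse_loss(mu_cur, mu_prev)

    # (2) Latent matching: reconstructions of new data should encode to the
    # same latent as the original inputs, aligning the generated features.
    x_rec, mu_x, _ = model(x_new)
    mu_rec, _ = model.encode(x_rec)
    loss_match = F.mse_loss(mu_rec, mu_x.detach())

    # These terms would be added to the usual ELBO on x_new plus a replay
    # reconstruction loss on x_replay.
    return loss_distill, loss_match, x_replay
```

The cycling step uses the frozen previous model twice by design: forcing generations back through its encoder bottleneck yields reconstructions, which, per the observation above, preserve old-task knowledge better than the raw generated samples.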


Related research

06/23/2021 · Multiband VAE: Latent Space Partitioning for Knowledge Consolidation in Continual Learning
We propose a new method for unsupervised continual knowledge consolidati...

12/21/2018 · Generative Models from the perspective of Continual Learning
Which generative model is the most suitable for Continual Learning? This...

03/22/2023 · Encoding Binary Concepts in the Latent Space of Generative Models for Enhancing Data Representation
Binary concepts are empirically used by humans to generalize efficiently...

04/12/2022 · Generative Negative Replay for Continual Learning
Learning continually is a key aspect of intelligence and a necessary abi...

03/28/2023 · Projected Latent Distillation for Data-Agnostic Consolidation in Distributed Continual Learning
Distributed learning on the edge often comprises self-centered devices (...

05/29/2023 · SHARP: Sparsity and Hidden Activation RePlay for Neuro-Inspired Continual Learning
Deep neural networks (DNNs) struggle to learn in dynamic environments si...

06/14/2022 · Learning towards Synchronous Network Memorizability and Generalizability for Continual Segmentation across Multiple Sites
In clinical practice, a segmentation network is often required to contin...
