Online Learned Continual Compression with Stacked Quantization Module

11/19/2019
by Lucas Caccia et al.

We introduce and study the problem of Online Continual Compression, where one attempts to learn to compress and store a representative dataset from a non-i.i.d. data stream while observing each sample only once. This problem is highly relevant for downstream online continual learning tasks, as well as for standard learning methods under resource-constrained data collection. To address it, we propose a new architecture which Stacks Quantization Modules (SQM), consisting of a series of discrete autoencoders, each equipped with its own memory. Every added module is trained to reconstruct the latent space of the previous module using fewer bits, allowing the learned representation to become more compact as training progresses. This modularity has several advantages: 1) moderate compression is available quickly, early in training, which is crucial for remembering the early tasks; 2) as more data needs to be stored, earlier data becomes more compressed, freeing memory; 3) unlike previous methods, our approach does not require pretraining, even on challenging datasets. We show several potential applications of this method. We first replace the episodic memory used in Experience Replay with SQM, leading to significant gains on standard continual learning benchmarks under a fixed memory budget. We then apply our method to online compression of larger images, such as those from ImageNet, and show that it is also effective with other modalities, such as LiDAR data.

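The abstract describes a stack of discrete autoencoders in which each added module re-compresses the previous module's latent space using fewer bits. Below is a minimal PyTorch-style sketch of that idea, not the authors' implementation: the class names (VectorQuantizer, QuantModule), layer sizes, and codebook sizes are illustrative assumptions, and the training losses and memory management are omitted.

    # Hedged sketch of stacked discrete autoencoders; names and sizes are assumptions.
    import torch
    import torch.nn as nn

    class VectorQuantizer(nn.Module):
        """Nearest-neighbour codebook lookup with a straight-through gradient."""
        def __init__(self, num_codes, dim):
            super().__init__()
            self.codebook = nn.Embedding(num_codes, dim)

        def forward(self, z):                              # z: (B, dim, H, W)
            flat = z.permute(0, 2, 3, 1).reshape(-1, z.shape[1])
            dist = torch.cdist(flat, self.codebook.weight)
            idx = dist.argmin(dim=1)                       # discrete indices to store
            q = self.codebook(idx).view(z.shape[0], z.shape[2], z.shape[3], -1)
            q = q.permute(0, 3, 1, 2)
            q = z + (q - z).detach()                       # straight-through estimator
            return q, idx

    class QuantModule(nn.Module):
        """One discrete autoencoder: encode, quantize, decode."""
        def __init__(self, in_ch, hidden, num_codes):
            super().__init__()
            self.enc = nn.Conv2d(in_ch, hidden, 4, stride=2, padding=1)
            self.vq = VectorQuantizer(num_codes, hidden)
            self.dec = nn.ConvTranspose2d(hidden, in_ch, 4, stride=2, padding=1)

        def forward(self, x):
            z = self.enc(x)
            q, idx = self.vq(z)
            return self.dec(q), idx

    # Module 0 compresses raw inputs; each later module is trained to reconstruct
    # the latent space of the previous one using a smaller codebook (fewer bits).
    modules = nn.ModuleList([
        QuantModule(3, 64, num_codes=256),   # coarse compression, available early
        QuantModule(64, 64, num_codes=32),   # re-compresses module 0's latents
    ])

    x = torch.randn(8, 3, 32, 32)
    q0, idx0 = modules[0].vq(modules[0].enc(x))   # idx0: indices stored at first
    q1, idx1 = modules[1].vq(modules[1].enc(q0))  # idx1: fewer bits per sample later

In this sketch, older samples could be migrated from the first module's indices to the second module's indices as memory fills, which mirrors the paper's claim that earlier data becomes more compressed as training progresses; the exact migration and replay policy is not specified here.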
Related research

05/18/2021 · ACAE-REMIND for Online Continual Learning with Compressed Feature Replay
Online continual learning aims to learn from a non-IID stream of data fr...

03/08/2022 · New Insights on Reducing Abrupt Representation Change in Online Continual Learning
In the online continual learning paradigm, agents must learn from a chan...

06/26/2020 · Continual Learning from the Perspective of Compression
Connectionist models such as neural networks suffer from catastrophic fo...

02/14/2022 · Memory Replay with Data Compression for Continual Learning
Continual learning needs to overcome catastrophic forgetting of the past...

07/04/2022 · It's all About Consistency: A Study on Memory Composition for Replay-Based Methods in Continual Learning
Continual Learning methods strive to mitigate Catastrophic Forgetting (C...

10/20/2021 · A TinyML Platform for On-Device Continual Learning with Quantized Latent Replays
In the last few years, research and development on Deep Learning models ...

11/03/2021 · One Pass ImageNet
We present the One Pass ImageNet (OPIN) problem, which aims to study the...
