Projected Latent Distillation for Data-Agnostic Consolidation in Distributed Continual Learning

by Antonio Carta, et al.

Distributed learning on the edge often involves self-centered devices (SCDs) that learn local tasks independently and are unwilling to contribute to the performance of other SCDs. How do we achieve forward transfer at zero cost for the individual SCDs? We formalize this problem as a Distributed Continual Learning scenario, where SCDs adapt to local tasks and a CL model consolidates the knowledge from the resulting stream of models without accessing the SCDs' private data. Unfortunately, current CL methods are not directly applicable to this scenario. We propose Data-Agnostic Consolidation (DAC), a novel double knowledge distillation method that consolidates the stream of SCD models without using the original data. DAC performs distillation in the latent space via a novel Projected Latent Distillation loss. Experimental results show that DAC enables forward transfer between SCDs and reaches state-of-the-art accuracy on Split CIFAR100, CORe50, and Split TinyImageNet in both rehearsal-free and distributed CL scenarios. Somewhat surprisingly, even a single out-of-distribution image is sufficient as the only source of data during consolidation.
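The abstract describes distillation in the latent space through a projection. As a rough, hedged illustration of the general idea (not the paper's actual formulation), the sketch below penalizes the mean squared error between a student latent vector mapped through a projection matrix and a teacher latent vector; the function name, the fixed linear projection `P`, and all shapes are illustrative assumptions.

```python
import numpy as np

def projected_latent_distillation(z_student, z_teacher, P):
    """MSE between the student's latent, projected by P, and the teacher's latent.

    Illustrative sketch only: the paper's actual loss may differ in how the
    projection is parameterized and learned, and in its normalization.
    """
    return float(np.mean((z_student @ P - z_teacher) ** 2))

rng = np.random.default_rng(0)
z_s = rng.normal(size=(4, 8))   # student latents: batch of 4, dimension 8
z_t = rng.normal(size=(4, 6))   # teacher latents: batch of 4, dimension 6
P = rng.normal(size=(8, 6))     # projection from student space to teacher space
loss = projected_latent_distillation(z_s, z_t, P)
```

The projection lets student and teacher use different latent dimensionalities, which matters when consolidating models from heterogeneous devices; when the teacher latent is exactly the projected student latent, the loss is zero.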




