Understanding Continual Learning Settings with Data Distribution Drift Analysis

04/04/2021
by   Timothée Lesort, et al.

Classical machine learning algorithms often assume that data are drawn i.i.d. from a stationary probability distribution. Continual learning is a rapidly growing area of machine learning in which this assumption is relaxed: the data distribution is non-stationary, i.e., it changes over time. However, data distribution drifts may interfere with the learning process and erase previously learned knowledge, so continual learning algorithms must include specialized mechanisms to deal with such drifts. A distribution drift may change the class-label distribution, the input distribution, or both, and it may be abrupt or gradual. In this paper, we aim to identify and categorize the different types of data distribution drift and the potential assumptions about them, in order to better characterize various continual-learning scenarios. We further propose to use this distribution-drift framework to give more precise definitions of several terms commonly used in the continual learning field.
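The two axes the abstract describes (what drifts: labels vs. inputs; how it drifts: abrupt vs. gradual) can be illustrated with toy data streams. The sketch below is not from the paper; the function names, class layout, and Gaussian inputs are hypothetical choices made purely for illustration. It contrasts an abrupt class-label drift (a class-incremental stream where each task introduces new labels) with a gradual input drift (labels stationary, input mean shifting slowly).

```python
import random


def class_incremental_stream(num_tasks=3, classes_per_task=2,
                             samples_per_task=4, seed=0):
    """Abrupt label-distribution drift: each task introduces new classes,
    and previously seen classes stop appearing at the task boundary."""
    rng = random.Random(seed)
    stream = []
    for task in range(num_tasks):
        # Labels available only during this task (hypothetical layout).
        classes = list(range(task * classes_per_task,
                             (task + 1) * classes_per_task))
        for _ in range(samples_per_task):
            y = rng.choice(classes)
            # Toy input loosely correlated with the label.
            x = rng.gauss(mu=float(y), sigma=0.1)
            stream.append((x, y))
    return stream


def gradual_input_drift_stream(length=10, seed=0):
    """Gradual input-distribution drift: the label distribution stays
    fixed while the input mean shifts slowly over time."""
    rng = random.Random(seed)
    stream = []
    for t in range(length):
        y = rng.choice([0, 1])                    # stationary labels
        x = rng.gauss(mu=t / length, sigma=0.1)   # mean drifts 0 -> 1
        stream.append((x, y))
    return stream
```

In the first stream, a learner trained sequentially never revisits old labels, which is the setting where forgetting is most visible; in the second, the label space is fixed but the input statistics move under the learner, a drift that regularization or replay mechanisms must track rather than memorize.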


Related research:

- Regularization Shortcomings for Continual Learning (12/06/2019)
- Energy-Based Models for Continual Learning (11/24/2020)
- Continuum: Simple Management of Complex Continual Learning Scenarios (02/11/2021)
- Continual Feature Selection: Spurious Features in Continual Learning (03/02/2022)
- Beyond Supervised Continual Learning: a Review (08/30/2022)
- Continual Learning in Deep Networks: an Analysis of the Last Layer (06/03/2021)
- A study on the plasticity of neural networks (05/31/2021)
