Facilitating Bayesian Continual Learning by Natural Gradients and Stein Gradients

04/24/2019
by Yu Chen, et al.

Continual learning aims to enable machine learning models to learn a general solution space for past and future tasks in a sequential manner. Conventional models tend to forget the knowledge of previous tasks while learning a new one, a phenomenon known as catastrophic forgetting. When Bayesian models are used for continual learning, knowledge from previous tasks can be retained in two ways: (1) posterior distributions over the parameters, which encode the knowledge gained from inference on previous tasks and serve as the priors for the following task; (2) coresets, which capture the data distributions of previous tasks. Here, we show that Bayesian continual learning can be facilitated through both of these means by the use of natural gradients and Stein gradients, respectively.
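As a concrete illustration of the two ingredients above, the minimal sketch below is not from the paper: the function names (`natural_gradient_step`, `svgd_step`, `kl_grad_mu`), the diagonal-Gaussian variational posterior, and the fixed-bandwidth RBF kernel are all illustrative assumptions. It shows (1) a natural-gradient update of a Gaussian variational mean, where the previous task's posterior enters as the prior, and (2) a Stein-gradient (SVGD-style) update that moves a small set of coreset particles toward a task's data distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- (1) Posterior-as-prior with a natural-gradient step -------------------
# For a diagonal Gaussian q(theta) = N(mu, sigma^2), the Fisher information
# with respect to the mean is diag(1 / sigma^2), so the natural gradient is
# the Euclidean gradient rescaled elementwise by sigma^2.
def natural_gradient_step(mu, sigma2, grad_mu, lr=0.1):
    """One natural-gradient update of the variational mean: mu - lr * F^-1 g."""
    return mu - lr * sigma2 * grad_mu

def kl_grad_mu(mu, prior_mu, prior_sigma2):
    """Gradient (w.r.t. mu) of KL(q || prior), where the prior is the
    previous task's posterior N(prior_mu, prior_sigma2)."""
    return (mu - prior_mu) / prior_sigma2

# --- (2) Stein gradients for coreset construction --------------------------
def rbf_kernel(x, h=1.0):
    """RBF kernel matrix k[j, i] = k(x_j, x_i) and its gradients
    grad_k[j, i] = d k(x_j, x_i) / d x_j."""
    diff = x[:, None, :] - x[None, :, :]                     # (n, n, d)
    k = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * h ** 2))   # (n, n)
    grad_k = -diff / h ** 2 * k[..., None]                   # (n, n, d)
    return k, grad_k

def svgd_step(particles, score_fn, lr=0.05, h=1.0):
    """One SVGD update: phi(x_i) = mean_j [k(x_j, x_i) * score(x_j)
    + grad_{x_j} k(x_j, x_i)]; particles drift toward the target density."""
    k, grad_k = rbf_kernel(particles, h)
    phi = (k @ score_fn(particles) + grad_k.sum(axis=0)) / len(particles)
    return particles + lr * phi

# Toy usage: summarise a 2-D Gaussian "task distribution" with 20 particles.
target_mu, target_sigma2 = np.zeros(2), np.ones(2)
score = lambda x: -(x - target_mu) / target_sigma2   # grad log N(mu, sigma^2)
coreset = rng.normal(loc=3.0, size=(20, 2))
for _ in range(300):
    coreset = svgd_step(coreset, score)
print(coreset.mean(axis=0))   # drifts toward target_mu as particles converge
```

In a variational continual learning setup along these lines, `kl_grad_mu` would be added to the gradient of the current task's expected log-likelihood before calling `natural_gradient_step`, and the Stein-updated particles would serve as the coreset summarizing the task for later replay.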

