Variational Continual Learning

10/29/2017
by Cuong V. Nguyen, et al.

This paper develops variational continual learning (VCL), a simple but general framework for continual learning that fuses online variational inference (VI) and recent advances in Monte Carlo VI for neural networks. The framework can successfully train both deep discriminative models and deep generative models in complex continual learning settings where existing tasks evolve over time and entirely new tasks emerge. Experimental results show that variational continual learning outperforms state-of-the-art continual learning methods on a variety of tasks, avoiding catastrophic forgetting in a fully automatic way.
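At its core, VCL treats continual learning as recursive Bayesian updating: the approximate posterior learned after one task becomes the prior for the next. Online VI projects each updated posterior back into the chosen variational family, yielding the recursion

    q_t(θ) = argmin_{q ∈ Q} KL( q(θ) ‖ (1/Z_t) q_{t-1}(θ) p(D_t | θ) ),   with q_0(θ) = p(θ),

where D_t is the data for the t-th task and Z_t is a normalizing constant. Because the KL term anchors each new posterior to the previous one, parameters that mattered for earlier tasks are discouraged from drifting, which is how the method counters catastrophic forgetting without task-specific heuristics.

Below is a minimal illustrative sketch of one such update for a mean-field Gaussian posterior over the weights of a linear model. All names, the toy data, and the single-layer model are assumptions made purely for illustration; the paper's experiments use multi-head Bayesian neural networks (and, optionally, coreset episodic memories) rather than this toy setup.

import torch

torch.manual_seed(0)

def vcl_task_update(x, y, prior_mu, prior_log_sigma, n_steps=500, n_samples=8):
    # One VCL step: fit q_t by maximizing the ELBO, with q_{t-1} as the prior.
    mu = prior_mu.clone().requires_grad_(True)
    log_sigma = prior_log_sigma.clone().requires_grad_(True)
    opt = torch.optim.Adam([mu, log_sigma], lr=1e-2)
    for _ in range(n_steps):
        opt.zero_grad()
        sigma = log_sigma.exp()
        # Monte Carlo estimate of the expected log-likelihood
        # (reparameterization trick, as in Monte Carlo VI).
        eps = torch.randn(n_samples, mu.shape[0])
        w = mu + sigma * eps                # samples from q(theta)
        pred = x @ w.t()                    # one column of predictions per sample
        exp_log_lik = -0.5 * ((pred - y) ** 2).mean()
        # KL( q_t || q_{t-1} ) between diagonal Gaussians, in closed form.
        p_sigma = prior_log_sigma.exp()
        kl = (prior_log_sigma - log_sigma
              + (sigma ** 2 + (mu - prior_mu) ** 2) / (2 * p_sigma ** 2)
              - 0.5).sum()
        loss = -(exp_log_lik - kl / x.shape[0])   # negative ELBO (per data point)
        loss.backward()
        opt.step()
    return mu.detach(), log_sigma.detach()

# Toy stream of two tasks: the posterior of each task is the next task's prior.
d = 3
mu, log_sigma = torch.zeros(d), torch.zeros(d)   # q_0(theta) = N(0, I)
for t in range(2):
    x = torch.randn(64, d)
    y = x @ torch.ones(d, 1) + 0.1 * torch.randn(64, 1)
    mu, log_sigma = vcl_task_update(x, y, mu, log_sigma)
    print(f"after task {t}: posterior mean = {mu.numpy().round(2)}")

The sketch keeps only the recursion itself: no raw data from earlier tasks is stored, since everything learned so far is carried in the prior (mu, log_sigma).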


Related Research

05/06/2019 · Improving and Understanding Variational Continual Learning
In the continual learning setting, tasks are encountered sequentially. T...

01/04/2023 · On Sequential Bayesian Inference for Continual Learning
Sequential Bayesian inference can be used for continual learning to prev...

02/20/2022 · Efficient Continual Learning Ensembles in Neural Network Subspaces
A growing body of research in continual learning focuses on the catastro...

11/24/2020 · Generalized Variational Continual Learning
Continual learning deals with training models on new tasks and datasets ...

12/04/2019 · Indian Buffet Neural Networks for Continual Learning
We place an Indian Buffet Process (IBP) prior over the neural structure ...

03/09/2020 · FoCL: Feature-Oriented Continual Learning for Generative Models
In this paper, we propose a general framework in continual learning for ...

06/12/2020 · CPR: Classifier-Projection Regularization for Continual Learning
We propose a general, yet simple patch that can be applied to existing r...
