Bayesian Gradient Descent: Online Variational Bayes Learning with Increased Robustness to Catastrophic Forgetting and Weight Pruning

03/27/2018
by Chen Zeno, et al.

We suggest a novel approach for estimating the posterior distribution of the weights of a neural network, using an online version of the variational Bayes method. Having a confidence measure for the weights allows us to combat several shortcomings of neural networks, such as their parameter redundancy and their notorious vulnerability to changes in the input distribution ("catastrophic forgetting"). Specifically, we show that this approach helps alleviate the catastrophic forgetting phenomenon, even without knowledge of when the tasks are switched. Furthermore, it improves the robustness of the network to weight pruning, even without re-training.
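To make the idea concrete, here is a minimal sketch of an online variational-Bayes update that maintains a factorized Gaussian posterior N(mu, sigma^2) over each weight, in the spirit of the abstract. The update rules, sample count, learning rate, and toy quadratic loss are illustrative assumptions, not the paper's exact algorithm; gradients of the expected loss are estimated by Monte Carlo with the reparameterization theta = mu + sigma * eps.

```python
import numpy as np

def vb_step(mu, sigma, grad_fn, n_samples=10, eta=1.0, rng=None):
    """One online variational-Bayes update of per-weight mean and std.

    grad_fn maps sampled weights theta to per-sample gradients dL/dtheta.
    (Illustrative sketch; hyperparameters are made up for the demo.)
    """
    rng = np.random.default_rng() if rng is None else rng
    eps = rng.standard_normal((n_samples,) + mu.shape)
    theta = mu + sigma * eps                 # reparameterized weight samples
    g = grad_fn(theta)                       # per-sample gradients
    g_mean = g.mean(axis=0)                  # Monte-Carlo estimate of E[grad]
    ge_mean = (g * eps).mean(axis=0)         # estimate of E[grad * eps]
    # The mean moves against the gradient, scaled by the posterior variance:
    # uncertain weights (large sigma) take larger steps.
    mu_new = mu - eta * sigma**2 * g_mean
    # The std shrinks where E[grad * eps] is positive (informative curvature),
    # acting as a per-weight confidence measure.
    half = 0.5 * sigma * ge_mean
    sigma_new = sigma * np.sqrt(1.0 + half**2) - sigma * half
    return mu_new, np.maximum(sigma_new, 1e-6)

def demo(steps=500, seed=0):
    # Toy loss L(theta) = 0.5 * ||theta - target||^2, so dL/dtheta = theta - target.
    # "target" is a made-up optimum for this demo.
    rng = np.random.default_rng(seed)
    target = np.array([3.0, -1.0])
    mu, sigma = np.zeros(2), 0.5 * np.ones(2)
    for _ in range(steps):
        mu, sigma = vb_step(mu, sigma, lambda th: th - target, rng=rng)
    return mu, sigma

mu, sigma = demo()
```

The per-weight sigma is what gives the confidence measure the abstract refers to: weights whose posterior std stays large relative to their mean carry little information about the loss, which is the kind of signal one could use to prune them without re-training.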


Related research

- 11/08/2018: Practical Bayesian Learning of Neural Networks via Adaptive Subgradient Methods. "We introduce a novel framework for the estimation of the posterior distr..."
- 05/04/2012: Weighted Patterns as a Tool for Improving the Hopfield Model. "We generalize the standard Hopfield model to the case when a weight is a..."
- 05/20/2018: Online Structured Laplace Approximations For Overcoming Catastrophic Forgetting. "We introduce the Kronecker factored online Laplace approximation for ove..."
- 07/31/2023: Lookbehind Optimizer: k steps back, 1 step forward. "The Lookahead optimizer improves the training stability of deep neural n..."
- 06/21/2021: Iterative Network Pruning with Uncertainty Regularization for Lifelong Sentiment Classification. "Lifelong learning capabilities are crucial for sentiment classifiers to ..."
- 02/08/2019: Stimulating STDP to Exploit Locality for Lifelong Learning without Catastrophic Forgetting. "Stochastic gradient descent requires that training samples be drawn from..."
- 07/22/2022: Revisiting Parameter Reuse to Overcome Catastrophic Forgetting in Neural Networks. "Neural networks tend to forget previously learned knowledge when continu..."
