A Unified Framework for Lifelong Learning in Deep Neural Networks

11/21/2019
by Charles X. Ling, et al.

Humans can learn a variety of concepts and skills incrementally over the course of their lives while exhibiting an array of desirable properties, such as non-forgetting, concept rehearsal, and forward and backward transfer of knowledge. Previous approaches to lifelong learning (LLL) have demonstrated only subsets of these properties, often relying on multiple separate mechanisms. In this paper, we propose a simple yet powerful unified framework that demonstrates all of these desirable properties. Our novel framework utilizes a small number of weight consolidation parameters dynamically applied to groups of weights, reflecting how resistant ("stiff") each group of weights is to modification during training in deep neural networks. In addition, we are able to draw many parallels between the behaviours and mechanisms of our model and those surrounding human learning, such as memory loss or sleep deprivation. This allows our approach to serve as a conduit for two-way inspiration to further understand lifelong learning in machines and humans.
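To make the idea of group-wise weight consolidation concrete, here is a minimal sketch in PyTorch. It assumes a quadratic (EWC-style) penalty anchored at the weights learned on the previous task, with one learnable stiffness scalar per top-level module; the class name `GroupConsolidation`, the per-module grouping, and the softplus parameterization are illustrative assumptions, not the paper's actual mechanism, which the abstract does not specify.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GroupConsolidation(nn.Module):
    """Quadratic consolidation penalty with one learnable "stiffness"
    scalar per named group of weights (here, per top-level module).

    Illustrative sketch only: the paper's framework applies a small number
    of consolidation parameters to groups of weights; the exact grouping
    and update rule may differ from what is shown here.
    """

    def __init__(self, model: nn.Module):
        super().__init__()
        # Snapshot of the weights after finishing the previous task.
        self.anchors = {
            name: p.detach().clone() for name, p in model.named_parameters()
        }
        # One stiffness scalar per group (top-level module name).
        groups = {name.split(".")[0] for name, _ in model.named_parameters()}
        self.stiffness = nn.ParameterDict(
            {g: nn.Parameter(torch.zeros(1)) for g in groups}
        )

    def penalty(self, model: nn.Module) -> torch.Tensor:
        """Sum of stiffness-weighted squared deviations from the anchors."""
        loss = torch.zeros(1)
        for name, p in model.named_parameters():
            group = name.split(".")[0]
            # Softplus keeps the effective stiffness non-negative.
            k = F.softplus(self.stiffness[group])
            loss = loss + (k * (p - self.anchors[name]) ** 2).sum()
        return loss
```

In use, the penalty would simply be added to the loss of the new task, e.g. `total_loss = task_loss + consolidation.penalty(model)`, so that stiffer groups of weights resist change (supporting non-forgetting) while more pliable groups remain free to adapt (supporting forward transfer).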
