Differentiable plasticity: training plastic neural networks with backpropagation

04/06/2018
by Thomas Miconi, et al.

How can we build agents that keep learning from experience, quickly and efficiently, after their initial training? Here we take inspiration from the main mechanism of learning in biological brains: synaptic plasticity, carefully tuned by evolution to produce efficient lifelong learning. We show that plasticity, just like connection weights, can be optimized by gradient descent in large (millions of parameters) recurrent networks with Hebbian plastic connections. First, recurrent plastic networks with more than two million parameters can be trained to memorize and reconstruct sets of novel, high-dimensional (1,000+ pixels) natural images not seen during training. Crucially, traditional non-plastic recurrent networks fail to solve this task. Furthermore, trained plastic networks can also solve generic meta-learning tasks such as the Omniglot task, with competitive results and little parameter overhead. Finally, in reinforcement learning settings, plastic networks outperform a non-plastic equivalent in a maze exploration task. We conclude that differentiable plasticity may provide a powerful novel approach to the learning-to-learn problem.
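The mechanism the abstract describes gives each connection a fixed weight, a plasticity coefficient, and a Hebbian trace that evolves during the network's "lifetime"; the fixed weights and plasticity coefficients are the quantities optimized by gradient descent. A minimal NumPy sketch of one such recurrent step, assuming the standard Hebbian running-average trace (function and variable names here are illustrative, not taken from the paper's code):

```python
import numpy as np

def plastic_step(x_prev, hebb, w, alpha, eta=0.03):
    """One step of a recurrent network with Hebbian plastic connections.

    Each connection's effective strength is a fixed part w plus a
    plasticity coefficient alpha times a Hebbian trace. In training,
    w and alpha would be optimized by backpropagation; hebb changes
    within an episode and is reset between episodes.
    """
    # Effective weights combine the fixed and the plastic components.
    x = np.tanh((w + alpha * hebb) @ x_prev)
    # Hebbian trace: decaying average of pre/post activity products,
    # with hebb[j, i] pairing postsynaptic unit j and presynaptic unit i.
    hebb = (1 - eta) * hebb + eta * np.outer(x, x_prev)
    return x, hebb

rng = np.random.default_rng(0)
n = 50
w = rng.normal(scale=0.1, size=(n, n))      # fixed weights (trained)
alpha = rng.normal(scale=0.1, size=(n, n))  # plasticity coefficients (trained)
hebb = np.zeros((n, n))                     # episode-specific trace
x = np.tanh(rng.normal(size=n))
for _ in range(10):
    x, hebb = plastic_step(x, hebb, w, alpha)
```

Because every operation above is differentiable, an autodiff framework can propagate gradients through the Hebbian trace across time steps, letting the optimizer learn not only the weights but how plastic each connection should be.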


Related research

- Backpropamine: training self-modifying neural networks with differentiable neuromodulated plasticity (02/24/2020). The impressive lifelong learning in animal brains is primarily enabled b...
- Learning to learn with backpropagation of Hebbian plasticity (09/08/2016). Hebbian plasticity is a powerful principle that allows biological brains...
- SpikePropamine: Differentiable Plasticity in Spiking Neural Networks (06/04/2021). The adaptive changes in synaptic efficacy that occur between spiking neu...
- Non-Differentiable Supervised Learning with Evolution Strategies and Hybrid Methods (06/07/2019). In this work we show that Evolution Strategies (ES) are a viable method ...
- Neural networks with differentiable structure (06/20/2016). While gradient descent has proven highly successful in learning connecti...
- Meta-Learning through Hebbian Plasticity in Random Networks (07/06/2020). Lifelong learning and adaptability are two defining aspects of biologica...
- Learning to acquire novel cognitive tasks with evolution, plasticity and meta-meta-learning (12/16/2021). In meta-learning, networks are trained with external algorithms to learn...
