Hebbian and Gradient-based Plasticity Enables Robust Memory and Rapid Learning in RNNs

02/07/2023 ∙ by Yu Duan, et al.

Rapidly learning from ongoing experiences and remembering past events with a flexible memory system are two core capacities of biological intelligence. While the underlying neural mechanisms are not fully understood, multiple lines of evidence support the view that synaptic plasticity plays a critical role in memory formation and fast learning. Inspired by these results, we equip Recurrent Neural Networks (RNNs) with plasticity rules that enable them to adapt their parameters according to ongoing experiences. In addition to the traditional local Hebbian plasticity, we propose a global, gradient-based plasticity rule, which allows the model to evolve towards a self-determined target. Our models show promising results on sequential and associative memory tasks, illustrating their ability to robustly form and retain memories. At the same time, these models can cope with many challenging few-shot learning problems. Comparing different plasticity rules under the same framework shows that Hebbian plasticity is well-suited for several memory and associative learning tasks; however, it is outperformed by gradient-based plasticity on few-shot regression tasks, which require the model to infer the underlying mapping. Code is available at https://github.com/yuvenduan/PlasticRNNs.
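
To make the two kinds of plasticity concrete, the sketch below (in PyTorch, the framework used by the linked repository) shows one way plastic RNNs of this flavor can be implemented. It is an illustration under stated assumptions, not the authors' exact method: the Hebbian trace update H ← λH + (1−λ)·h_post·h_pre^T follows the common differentiable-plasticity formulation, the internally generated squared-error loss in the gradient-based step is an assumed objective, and names such as HebbianPlasticRNNCell and gradient_plasticity_step are hypothetical.

import torch
import torch.nn as nn

class HebbianPlasticRNNCell(nn.Module):
    # RNN cell whose effective recurrent weights combine a slow, learned
    # component W with a fast Hebbian trace H: W_eff = W + alpha * H.
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.W_in = nn.Linear(input_size, hidden_size)
        self.W = nn.Parameter(0.01 * torch.randn(hidden_size, hidden_size))
        # Per-synapse plasticity strengths and a shared decay rate, both
        # meta-learned by the outer gradient-descent loop (assumed setup).
        self.alpha = nn.Parameter(0.01 * torch.randn(hidden_size, hidden_size))
        self.decay_logit = nn.Parameter(torch.tensor(2.0))

    def forward(self, x, h, H):
        # x: (B, input), h: (B, hidden), H: (B, hidden, hidden)
        W_eff = self.W + self.alpha * H  # broadcasts over the batch dimension
        h_new = torch.tanh(self.W_in(x) + torch.einsum("bij,bj->bi", W_eff, h))
        # Local Hebbian update: decaying outer product of post- and
        # pre-synaptic activity (assumed form of the rule).
        lam = torch.sigmoid(self.decay_logit)
        H_new = lam * H + (1.0 - lam) * torch.einsum("bi,bj->bij", h_new, h)
        return h_new, H_new

def gradient_plasticity_step(W_plastic, h, target, lr=0.1):
    # Global, gradient-based plasticity: descend the gradient of an
    # internally generated loss (assumption: squared error to a
    # self-determined target; the paper's actual objective may differ).
    # W_plastic must require grad; create_graph=True keeps the update
    # differentiable for the outer meta-learning loop.
    loss = ((h @ W_plastic.T - target) ** 2).mean()
    grad, = torch.autograd.grad(loss, W_plastic, create_graph=True)
    return W_plastic - lr * grad

# Usage: run the Hebbian cell over a short sequence.
cell = HebbianPlasticRNNCell(input_size=10, hidden_size=32)
h = torch.zeros(8, 32)
H = torch.zeros(8, 32, 32)
for x in torch.randn(5, 8, 10):  # 5 time steps, batch of 8
    h, H = cell(x, h, H)

In a full model, the plastic updates run inside the forward pass while standard backpropagation across episodes meta-learns the slow parameters (W, alpha, and the decay rate), which is what lets the two rules be compared under the same framework.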


