BioGrad: Biologically Plausible Gradient-Based Learning for Spiking Neural Networks

10/27/2021
by Guangzhi Tang, et al.

Spiking neural networks (SNN) are delivering energy-efficient, massively parallel, and low-latency solutions to AI problems, facilitated by emerging neuromorphic chips. To harness these computational benefits, SNN need to be trained by learning algorithms that adhere to brain-inspired neuromorphic principles, namely event-based, local, and online computation. Yet, state-of-the-art SNN training algorithms are based on backprop, which does not follow these principles. Because of its limited biological plausibility, applying backprop to SNN requires non-local feedback pathways to transmit continuous-valued errors and relies on gradients from future timesteps. Biologically plausible modifications to backprop have helped overcome several of these limitations, but they also limit how closely backprop is approximated, which hinders performance. We propose a biologically plausible gradient-based learning algorithm for SNN that is functionally equivalent to backprop while adhering to all three neuromorphic principles. We introduce multi-compartment spiking neurons with local eligibility traces to compute the gradients required for learning, and a periodic "sleep" phase, during which a local Hebbian rule aligns the feedback and feedforward weights, to further improve the approximation to backprop. Our method achieved the same level of performance as backprop with multi-layer fully connected SNN on the MNIST dataset (98.13%), among other benchmarks. We deployed our learning algorithm on Intel's Loihi to train a 1-hidden-layer network for MNIST and obtained 93.32% test accuracy while consuming 400 times less energy per training sample than BioGrad on GPU. Our work shows that optimal learning is feasible in neuromorphic computing, and that further pursuing its biological plausibility can better capture the benefits of this emerging computing paradigm.
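To make the sleep-phase idea concrete, below is a minimal rate-based sketch (not the spiking, multi-compartment implementation described in the paper) of how a local Hebbian rule can align a separate feedback matrix B with the transpose of the feedforward weights W: during "sleep", the input layer is driven by random activity, the hidden layer responds through W, and B is updated using only locally available pre- and post-synaptic activity. The layer sizes, learning rate, and decay constant are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer sizes (assumptions, not from the paper)
n_in, n_hidden = 100, 30

# Feedforward weights W and a separate, initially unrelated feedback matrix B
W = rng.normal(0.0, 0.1, size=(n_hidden, n_in))
B = rng.normal(0.0, 0.1, size=(n_in, n_hidden))

def alignment(B, W):
    """Cosine similarity between the feedback matrix and the transposed feedforward weights."""
    return float(np.sum(B * W.T) / (np.linalg.norm(B) * np.linalg.norm(W)))

print(f"alignment before sleep: {alignment(B, W):.3f}")

# "Sleep" phase: random drive on the input layer, hidden response through W,
# and a purely local Hebbian update (pre-activity times post-activity) with decay.
lr, decay = 0.01, 0.1
for _ in range(5000):
    x = rng.standard_normal(n_in)            # random drive (stand-in for noise spikes)
    y = W @ x                                # hidden-layer response
    B += lr * (np.outer(x, y) - decay * B)   # local Hebbian alignment step

print(f"alignment after sleep:  {alignment(B, W):.3f}")
```

Because the random drive has identity covariance, the expected Hebbian update pulls B toward W^T (up to the decay scale), so the feedback pathway used during the subsequent wake phase can deliver errors that approximate the true backprop gradients.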

Related research

- Brain-Inspired Learning on Neuromorphic Substrates (10/22/2020): Neuromorphic hardware strives to emulate brain-like neural networks and ...
- Increasing Liquid State Machine Performance with Edge-of-Chaos Dynamics Organized by Astrocyte-modulated Plasticity (10/26/2021): The liquid state machine (LSM) combines low training complexity and biol...
- Flexible Phase Dynamics for Bio-Plausible Contrastive Learning (02/24/2023): Many learning algorithms used as normative models in neuroscience or as ...
- Biologically Plausible Variational Policy Gradient with Spiking Recurrent Winner-Take-All Networks (10/21/2022): One stream of reinforcement learning research is exploring biologically ...
- Biologically Plausible Learning on Neuromorphic Hardware Architectures (12/29/2022): With an ever-growing number of parameters defining increasingly complex ...
- Towards On-Chip Bayesian Neuromorphic Learning (05/05/2020): If edge devices are to be deployed to critical applications where their ...
- 2D-Motion Detection using SNNs with Graphene-Insulator-Graphene Memristive Synapses (11/30/2021): The event-driven nature of spiking neural networks makes them biological...
