Skip-Connected Self-Recurrent Spiking Neural Networks with Joint Intrinsic Parameter and Synaptic Weight Training

10/23/2020
by   Wenrui Zhang, et al.

As an important class of spiking neural networks (SNNs), recurrent spiking neural networks (RSNNs) possess great computational power and have been widely used for processing sequential data such as audio and text. However, most RSNNs suffer from two problems: (1) due to a lack of architectural guidance, random recurrent connectivity is often adopted, which does not guarantee good performance; (2) training RSNNs is in general challenging, bottlenecking achievable model accuracy. To address these problems, we propose a new type of RSNN called Skip-Connected Self-Recurrent SNNs (ScSr-SNNs). Recurrence in ScSr-SNNs is introduced in a stereotyped manner by adding self-recurrent connections to spiking neurons, which implement local memory. The network dynamics are further enriched by skip connections between non-adjacent layers. Constructed from simplified self-recurrent and skip connections, ScSr-SNNs can realize recurrent behaviors similar to those of more complex RSNNs, while error gradients can be computed more straightforwardly thanks to the mostly feedforward nature of the network. Moreover, we propose a new backpropagation (BP) method, called backpropagated intrinsic plasticity (BIP), to further boost the performance of ScSr-SNNs by training intrinsic model parameters. Unlike standard intrinsic plasticity rules that adjust a neuron's intrinsic parameters according to neuronal activity, the proposed BIP method optimizes intrinsic parameters based on the backpropagated error gradient of a well-defined global loss function, in addition to synaptic weight training. On challenging speech and neuromorphic speech datasets, including TI46-Alpha, TI46-Digits, and N-TIDIGITS, the proposed ScSr-SNNs boost performance by up to 2.55% over other types of RSNNs trained by state-of-the-art BP methods.
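To make the architecture concrete, here is a minimal sketch of the forward dynamics the abstract describes: each spiking neuron gets a scalar self-recurrent weight (local memory), and a deeper layer receives a skip connection from a non-adjacent layer. This is not the authors' implementation — the layer sizes, the simplified discrete-time leaky integrate-and-fire model, and all function and parameter names (`lif_layer_step`, `scsr_forward`, `tau`, `w_self`, `w_skip`) are illustrative assumptions; the surrogate-gradient training of weights and intrinsic parameters (BIP) is omitted.

```python
import numpy as np

def lif_layer_step(x, v, s_prev, w_in, w_self, tau, v_th=1.0):
    """One discrete-time step of a leaky integrate-and-fire layer.

    Each neuron feeds its own previous spike back through a scalar
    self-recurrent weight (w_self), giving local memory without a full
    recurrent weight matrix. tau is an intrinsic leak parameter; under
    BIP it would be trained by backpropagation alongside the synaptic
    weights (surrogate gradients omitted in this sketch).
    """
    # leaky integration of feedforward input plus self-recurrent feedback
    v = (1.0 - 1.0 / tau) * v + x @ w_in + s_prev * w_self
    s = (v >= v_th).astype(v.dtype)   # spike when the membrane potential crosses threshold
    v = v * (1.0 - s)                 # hard reset after a spike
    return v, s

def scsr_forward(x_seq, params):
    """Forward pass of a hypothetical 3-layer ScSr-style network.

    Layer 3 receives a skip connection from layer 1's spikes in addition
    to layer 2's output, enriching the dynamics of an otherwise mostly
    feedforward network.
    """
    (w1, a1, t1), (w2, a2, t2), (w3, w_skip, a3, t3) = params
    v1 = np.zeros(w1.shape[1]); s1 = np.zeros_like(v1)
    v2 = np.zeros(w2.shape[1]); s2 = np.zeros_like(v2)
    v3 = np.zeros(w3.shape[1]); s3 = np.zeros_like(v3)
    out = []
    for x in x_seq:
        v1, s1 = lif_layer_step(x, v1, s1, w1, a1, t1)
        v2, s2 = lif_layer_step(s1, v2, s2, w2, a2, t2)
        # skip connection: layer 1 spikes bypass layer 2 into layer 3
        x3 = np.concatenate([s2, s1])
        v3, s3 = lif_layer_step(x3, v3, s3, np.vstack([w3, w_skip]), a3, t3)
        out.append(s3.copy())
    return np.stack(out)
```

In a full implementation, the non-differentiable threshold would be handled with a surrogate gradient, and `tau` (and other intrinsic parameters) would be exposed to the optimizer alongside `w1`, `w2`, `w3`, and `w_skip`, which is the essence of the joint training the paper proposes.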


