Spikeformer: A Novel Architecture for Training High-Performance Low-Latency Spiking Neural Network

11/19/2022
by Yudong Li, et al.

Spiking neural networks (SNNs) have made great progress in both performance and efficiency over the last few years, but their unique working pattern makes it hard to train a high-performance, low-latency SNN. Thus the development of SNNs still lags behind that of traditional artificial neural networks (ANNs). To bridge this gap, many extraordinary works have been proposed. Nevertheless, these works are mainly based on the same kind of network structure (i.e., CNN), and their performance is worse than that of their ANN counterparts, which limits the applications of SNNs. To this end, we propose a novel Transformer-based SNN, termed "Spikeformer", which outperforms its ANN counterpart on both static and neuromorphic datasets and may be an alternative architecture to CNN for training high-performance SNNs. First, to deal with the problem of "data hungry" and the unstable training period exhibited in the vanilla model, we design the Convolutional Tokenizer (CT) module, which improves the accuracy of the original model on DVS-Gesture by more than 16%. Besides, to better incorporate the attention mechanism inside the Transformer and the spatio-temporal information inherent to SNNs, we adopt spatio-temporal attention (STA) instead of spatial-wise or temporal-wise attention. With our proposed method, we achieve competitive or state-of-the-art (SOTA) SNN performance on the DVS-CIFAR10, DVS-Gesture, and ImageNet datasets with the fewest simulation time steps (i.e., low latency). Remarkably, our Spikeformer outperforms other SNNs on ImageNet by a large margin (i.e., more than 5%) and even outperforms its ANN counterpart by 3.1% and 2.2% on DVS-Gesture and ImageNet respectively, indicating that Spikeformer is a promising architecture for training large-scale SNNs and may be more suitable for SNNs than CNN. We believe this work will keep the development of SNNs in step with that of ANNs as much as possible. Code will be available.
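The abstract's spatio-temporal attention (STA) computes attention jointly over time steps and spatial tokens, rather than per time step (spatial-wise) or per token (temporal-wise). The paper's actual module is not reproduced here; the following is only a rough illustrative sketch in plain NumPy, with identity projections standing in for learned Q/K/V weights and no spiking dynamics:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def spatio_temporal_attention(x):
    """Joint attention over time and space (illustrative sketch only).

    x: array of shape (T, N, D) — T time steps, N spatial tokens, D channels.
    Tokens from ALL time steps attend to each other, unlike spatial-wise
    attention (within one step) or temporal-wise attention (within one token).
    """
    T, N, D = x.shape
    tokens = x.reshape(T * N, D)            # fuse time and space into one token axis
    q, k, v = tokens, tokens, tokens        # identity projections for brevity
    scores = q @ k.T / np.sqrt(D)           # (T*N, T*N) joint similarity matrix
    attn = softmax(scores, axis=-1)         # each token attends across time AND space
    return (attn @ v).reshape(T, N, D)      # restore (T, N, D) layout
```

The key design point this sketches: flattening the temporal and spatial axes before attention gives a (T·N)×(T·N) attention map, so the quadratic cost grows with both dimensions, which is the trade-off STA accepts to mix information across time steps.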

Related research

10/29/2020 · Going Deeper With Directly-Trained Larger Spiking Neural Networks
Spiking neural networks (SNNs) are promising in a bio-plausible coding f...

03/18/2022 · Ultra-low Latency Spiking Neural Networks with Spatio-Temporal Compression and Synaptic Convolutional Block
Spiking neural networks (SNNs), as one of the brain-inspired models, have...

09/28/2022 · Attention Spiking Neural Networks
Benefiting from the event-driven and sparse spiking characteristics of t...

06/08/2017 · Spatio-Temporal Backpropagation for Training High-performance Spiking Neural Networks
Compared with artificial neural networks (ANNs), spiking neural networks...

05/23/2023 · Temporal Contrastive Learning for Spiking Neural Networks
Biologically inspired spiking neural networks (SNNs) have garnered consi...

03/24/2023 · A Hybrid ANN-SNN Architecture for Low-Power and Low-Latency Visual Perception
Spiking Neural Networks (SNN) are a class of bio-inspired neural network...

03/02/2022 · Rethinking Pretraining as a Bridge from ANNs to SNNs
Spiking neural networks (SNNs) are known as a typical kind of brain-insp...
