Accelerating spiking neural network training

05/30/2022
by Luke Taylor, et al.

Spiking neural networks (SNNs) are a type of artificial network inspired by the use of action potentials in the brain. There is growing interest in emulating these networks on neuromorphic computers because of their lower energy consumption and higher speed, which address the main scaling issues of their counterpart, the artificial neural network (ANN). Significant progress has been made in directly training SNNs to perform on par with ANNs in terms of accuracy. These methods are, however, slow due to their sequential nature, leading to long training times. We propose a new technique for directly training single-spike-per-neuron SNNs that eliminates all sequential computation and relies exclusively on vectorised operations. We demonstrate over a 10× speedup in training, with robust classification performance on real datasets of low to medium spatio-temporal complexity (Fashion-MNIST and Neuromorphic-MNIST). Our proposed solution solves certain tasks with over a 95.68% reduction in spike counts relative to a conventionally trained SNN, which could significantly reduce energy requirements when deployed on neuromorphic computers.
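The abstract does not spell out the method, but the core idea of replacing a sequential time-stepped simulation with vectorised operations over the time axis can be sketched as follows. This is a minimal illustration, not the authors' implementation: the non-leaky integrate-and-fire model, the threshold value, and the function names are assumptions chosen so the two routines provably agree.

```python
# Illustrative sketch only: for a non-leaky integrate-and-fire neuron that may
# spike at most once, the membrane potential is the running sum of its input
# current, so the first threshold crossing can be found with vectorised
# operations over the whole time axis instead of a per-timestep loop.
import numpy as np

def first_spike_times_sequential(currents, threshold=1.0):
    """Reference time-stepped loop; currents has shape (T, N)."""
    T, N = currents.shape
    v = np.zeros(N)
    spike_t = np.full(N, T)                 # T acts as "no spike"
    for t in range(T):                      # sequential over time
        v += currents[t]
        newly = (v >= threshold) & (spike_t == T)
        spike_t[newly] = t
    return spike_t

def first_spike_times_vectorised(currents, threshold=1.0):
    """Same result with no loop over time."""
    v = np.cumsum(currents, axis=0)         # membrane potential, shape (T, N)
    crossed = v >= threshold                # boolean crossings, shape (T, N)
    spike_t = np.argmax(crossed, axis=0)    # index of first crossing per neuron
    spike_t[~crossed.any(axis=0)] = currents.shape[0]  # neurons that never spike
    return spike_t

rng = np.random.default_rng(0)
I = rng.normal(0.05, 0.2, size=(100, 8))    # (time steps, neurons)
assert np.array_equal(first_spike_times_sequential(I),
                      first_spike_times_vectorised(I))
```

Because the vectorised version expresses the whole simulation as array operations, it maps naturally onto GPU-friendly batched computation, which is the kind of restructuring the speedup reported above relies on.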
