A neuromorphic boost to RNNs using low pass filters
The increasing difficulty of sustaining Moore's law scaling and the remarkable success of machine learning have triggered a renaissance in the study of low-latency, energy-efficient accelerators for machine learning applications. In particular, spiking neural networks (SNNs) and their neuromorphic hardware implementations have begun to receive substantial attention from academia and industry. However, SNNs achieve relatively poor accuracy on pattern recognition tasks compared to their rate-based counterparts. In this paper, we present a low-pass recurrent neural network (lpRNN) cell that can be trained using backpropagation and implemented on neuromorphic SNN devices. The ability to implement our model on neuromorphic hardware enables the construction of compact devices for always-on processing in ultra-low-power edge computing applications. We further argue that the low-pass filter acts as a temporal regularizer and highlight its advantage when added to a Long Short-Term Memory (LSTM) cell. We show that low-pass RNNs learn and generalize better than their unfiltered variants on two synthetic long-memory tasks, a character-level text modeling task, and a neuromorphic spoken command detection system.
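To make the core idea concrete, the sketch below shows one plausible reading of a low-pass RNN cell: a vanilla tanh-RNN update blended with the previous hidden state through a per-unit retention coefficient, h_t = alpha * h_{t-1} + (1 - alpha) * tanh(W x_t + U h_{t-1} + b), so that alpha acts as a first-order low-pass filter on the state. This is only a minimal illustration under our own assumptions, not the paper's exact formulation; the names LowPassRNNCell, step, and alpha_raw are ours.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class LowPassRNNCell:
        """Minimal sketch (our assumption, not the paper's exact lpRNN) of a
        low-pass RNN cell: a leaky blend of the previous hidden state and a
        vanilla tanh-RNN update, gated by a learnable retention coefficient."""

        def __init__(self, input_size, hidden_size, rng=None):
            rng = rng or np.random.default_rng(0)
            s = 1.0 / np.sqrt(hidden_size)
            self.W = rng.uniform(-s, s, (hidden_size, input_size))   # input weights
            self.U = rng.uniform(-s, s, (hidden_size, hidden_size))  # recurrent weights
            self.b = np.zeros(hidden_size)                           # bias
            # Unconstrained parameter; a sigmoid keeps the retention in (0, 1).
            self.alpha_raw = np.zeros(hidden_size)

        def step(self, x, h_prev):
            alpha = sigmoid(self.alpha_raw)  # per-unit retention factor in (0, 1)
            h_tilde = np.tanh(self.W @ x + self.U @ h_prev + self.b)  # vanilla RNN update
            # First-order low-pass filter: convex blend of old state and new update.
            return alpha * h_prev + (1.0 - alpha) * h_tilde

    # Toy usage: run the cell over a random 20-step sequence of 8-dim inputs.
    cell = LowPassRNNCell(input_size=8, hidden_size=16)
    h = np.zeros(16)
    for x in np.random.default_rng(1).normal(size=(20, 8)):
        h = cell.step(x, h)

Because alpha is constrained to (0, 1), the state evolves smoothly over time, which is one intuition for the temporal-regularization claim above; the same convex blending could in principle be applied to an LSTM's hidden state.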