Multi-Compartment Variational Online Learning for Spiking Neural Networks
Spiking Neural Networks (SNNs) offer a novel computational paradigm that captures some of the efficiency of biological brains for inference and learning via recursive processing and binary neural activations. Most existing training algorithms for SNNs assume spiking neuron models in which a neuron outputs individual spikes as a function of the dynamics of an internal state variable known as the membrane potential. This paper explores a more general model in which each spiking neuron contains multiple compartments, each tracking the dynamics of a distinct membrane potential, while sharing the same synaptic weights across compartments. It is demonstrated that learning rules based on probabilistic generalized linear neural models can leverage the presence of multiple compartments through modern variational inference based on importance weighting or generalized expectation-maximization. The key idea is to use the neural compartments to sample multiple independent spiking signals from hidden neurons so as to obtain better statistical estimates of the likelihood training criterion. The derived online learning algorithms follow three-factor rules with global learning signals. Experimental results on a structured-output memorization task and a classification task on a standard neuromorphic data set demonstrate significant improvements in terms of accuracy and calibration with an increasing number of compartments.
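To make the importance-weighting idea concrete, a minimal sketch of the underlying multi-sample estimator, using generic notation that is not taken from the abstract: if each of the $K$ compartments draws an independent hidden-spike sample $\mathbf{z}_k$ from a variational distribution $q_\theta(\cdot \mid \mathbf{x})$, the log-likelihood of the observed spiking signals $\mathbf{x}$ can be estimated as

$$\hat{\mathcal{L}}_K = \log \frac{1}{K} \sum_{k=1}^{K} \frac{p_\theta(\mathbf{x}, \mathbf{z}_k)}{q_\theta(\mathbf{z}_k \mid \mathbf{x})}, \qquad \mathbf{z}_k \sim q_\theta(\cdot \mid \mathbf{x}) \ \text{i.i.d.},$$

whose expectation is a lower bound on $\log p_\theta(\mathbf{x})$ that tightens as $K$ grows, consistent with the reported gains from increasing the number of compartments.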