Energy-efficient Knowledge Distillation for Spiking Neural Networks

06/14/2021
by   Dongjin Lee, et al.

Spiking neural networks (SNNs) have been gaining interest as energy-efficient alternatives to conventional artificial neural networks (ANNs) due to their event-driven computation. Considering the future deployment of SNN models on constrained neuromorphic devices, many studies have applied techniques originally developed for ANN model compression, such as network quantization, pruning, and knowledge distillation, to SNNs. Among them, existing works on knowledge distillation reported accuracy improvements of the student SNN model. However, analysis of energy efficiency, which is also an important feature of SNNs, was absent. In this paper, we thoroughly analyze the performance of the distilled SNN model in terms of accuracy and energy efficiency. In the process, we observe a substantial increase in the number of spikes, leading to energy inefficiency, when conventional knowledge distillation methods are used. Based on this analysis, we propose a novel knowledge distillation method with heterogeneous temperature parameters to achieve energy efficiency. We evaluate our method on two different datasets and show that the resulting student SNN achieves both improved accuracy and a reduced number of spikes. On the MNIST dataset, our proposed student SNN achieves up to 0.09% higher accuracy and produces 65% fewer spikes than the student SNN trained with the conventional knowledge distillation method. We also compare the results with other SNN compression techniques and training methods.
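For context, below is a minimal sketch of the temperature-based distillation loss the abstract refers to, assuming the standard Hinton-style formulation (softened teacher targets plus hard-label cross-entropy). The separate T_student and T_teacher arguments are a hypothetical illustration of "heterogeneous temperature parameters"; the paper's exact formulation and scaling may differ.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels,
            T_student=4.0, T_teacher=4.0, alpha=0.5):
    """Knowledge-distillation loss with separate temperatures for the
    student and teacher softmax. Setting T_student == T_teacher recovers
    the conventional homogeneous-temperature KD loss; distinct values are
    an assumed illustration of heterogeneous temperatures."""
    # Soft targets: teacher distribution softened by the teacher temperature.
    soft_targets = F.softmax(teacher_logits / T_teacher, dim=1)
    # Student log-probabilities softened by the student temperature.
    log_student = F.log_softmax(student_logits / T_student, dim=1)
    # Distillation term: KL divergence between softened distributions,
    # rescaled by the temperatures (T^2 in the standard homogeneous case).
    distill = F.kl_div(log_student, soft_targets, reduction="batchmean")
    distill = distill * T_student * T_teacher
    # Hard-label cross-entropy on the raw student logits.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1.0 - alpha) * ce
```

In a typical training loop, student_logits would come from the (spiking) student network's output layer accumulated over time steps, teacher_logits from a pretrained ANN or SNN teacher, and the temperatures would be tuned to trade off accuracy against spike activity.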


