PrivateSNN: Fully Privacy-Preserving Spiking Neural Networks

04/07/2021
by Youngeun Kim, et al.

How can we bring both privacy and energy efficiency to a neural system on edge devices? In this paper, we propose PrivateSNN, which aims to build low-power Spiking Neural Networks (SNNs) from a pre-trained ANN model without leaking sensitive information contained in a dataset. We tackle two types of leakage: 1) data leakage, caused when the networks access real training data during the ANN-SNN conversion process, and 2) class leakage, caused when class-related features can be reconstructed from the network parameters. To address the data leakage issue, we generate synthetic images from the pre-trained ANN and convert the ANN to an SNN using the generated images. However, the converted SNN is still vulnerable to class leakage, since its weight parameters have the same (or scaled) values as the ANN parameters. Therefore, we encrypt the SNN weights by training the SNN with a temporal spike-based learning rule. Updating the weight parameters with temporal data makes the network difficult to interpret in the spatial domain. We observe that the encrypted PrivateSNN can be implemented not only without a huge performance drop (less than 5%) but also with a significant energy-efficiency gain (about 60x compared to the standard ANN). We conduct extensive experiments on various datasets including CIFAR10, CIFAR100, and TinyImageNet, highlighting the importance of privacy-preserving SNN training.
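
The abstract describes a three-stage pipeline: data-free image synthesis from the pre-trained ANN, ANN-to-SNN conversion using only those synthetic images, and temporal spike-based re-training that moves the SNN weights away from the ANN weights. The sketch below is only an illustration of those stages in PyTorch, not the authors' released code: the toy network, the threshold-balancing heuristic, the Bernoulli rate coding, and the surrogate-gradient fine-tune (standing in for the paper's temporal spike-based learning rule) are all assumptions chosen for brevity.

```python
# Minimal, illustrative sketch of the three stages described above, assuming
# PyTorch. All hyperparameters and helper names are placeholders, not the
# authors' implementation.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

# A toy "pre-trained" ANN standing in for the released classifier.
ann = nn.Sequential(nn.Flatten(),
                    nn.Linear(3 * 32 * 32, 256), nn.ReLU(),
                    nn.Linear(256, 10))


# Stage 1: data-free synthesis. Optimize random noise until the frozen ANN
# labels it confidently, so no real training image is ever accessed.
def synthesize(ann, n=32, shape=(3, 32, 32), classes=10, steps=100, lr=0.1):
    ann.eval()
    x = torch.randn(n, *shape, requires_grad=True)
    y = torch.randint(0, classes, (n,))
    opt = torch.optim.Adam([x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        F.cross_entropy(ann(x), y).backward()
        opt.step()
    return x.detach(), y


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike with a boxcar surrogate gradient around the threshold."""
    @staticmethod
    def forward(ctx, v, threshold):
        ctx.save_for_backward(v)
        ctx.threshold = threshold
        return (v >= threshold).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        window = ((v - ctx.threshold).abs() < 1.0).float()  # width is a free choice
        return grad_out * window, None


class IFLayer(nn.Module):
    """Integrate-and-fire neuron layer driven by a copied ANN linear layer."""
    def __init__(self, linear, threshold):
        super().__init__()
        self.linear, self.threshold, self.v = linear, threshold, None

    def forward(self, spikes):                  # one simulation timestep
        current = self.linear(spikes)
        self.v = current if self.v is None else self.v + current
        out = SurrogateSpike.apply(self.v, self.threshold)
        self.v = self.v - out * self.threshold  # soft reset
        return out

    def reset(self):
        self.v = None


# Stage 2: ANN->SNN conversion. Copy weights and set each firing threshold to
# the maximum pre-activation seen on the synthetic images (threshold balancing).
def convert(ann, images):
    x, layers = nn.Flatten()(images), []
    with torch.no_grad():
        for m in ann:
            if isinstance(m, nn.Linear):
                pre = m(x)
                layers.append(IFLayer(copy.deepcopy(m), pre.max().item()))
                x = F.relu(pre)
    return layers


# Stage 3: weight "encryption". Re-train the converted weights on spike trains
# so they no longer match the original ANN weights.
def temporal_finetune(snn, images, labels, T=20, lr=1e-3, epochs=3):
    opt = torch.optim.SGD([p for l in snn for p in l.linear.parameters()], lr=lr)
    rates = nn.Flatten()(images).clamp(0, 1)        # Bernoulli rate coding
    for _ in range(epochs):
        for l in snn:
            l.reset()
        logits = torch.zeros(images.size(0), snn[-1].linear.out_features)
        for _ in range(T):
            s = (torch.rand_like(rates) < rates).float()
            for l in snn[:-1]:
                s = l(s)
            logits = logits + snn[-1].linear(s)     # accumulate readout current
        loss = F.cross_entropy(logits / T, labels)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return snn


images, labels = synthesize(ann)      # no real data used
snn = convert(ann, images)            # conversion driven by synthetic images only
snn = temporal_finetune(snn, images, labels)
```

The privacy-relevant points the sketch tries to mirror are that only synthesized images ever reach the conversion step, and that the final temporal training stage leaves the SNN weights different from the original ANN weights.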

Related research

06/13/2021 · A Free Lunch From ANN: Towards Efficient, Accurate Spiking Neural Networks Calibration
Spiking Neural Network (SNN) has been recognized as one of the next gene...

05/03/2023 · Joint A-SNN: Joint Training of Artificial and Spiking Neural Networks via Self-Distillation and Weight Factorization
Emerged as a biology-inspired method, Spiking Neural Networks (SNNs) mim...

06/20/2022 · Examining the Robustness of Spiking Neural Networks on Non-ideal Memristive Crossbars
Spiking Neural Networks (SNNs) have recently emerged as the low-power al...

10/14/2021 · Beyond Classification: Directly Training Spiking Neural Networks for Semantic Segmentation
Spiking Neural Networks (SNNs) have recently emerged as the low-power al...

07/27/2022 · Text Classification in Memristor-based Spiking Neural Networks
Memristors, emerging non-volatile memory devices, have shown promising p...

03/01/2021 · A Little Energy Goes a Long Way: Energy-Efficient, Accurate Conversion from Convolutional Neural Networks to Spiking Neural Networks
Spiking neural networks (SNNs) offer an inherent ability to process spat...

10/10/2022 · Training Spiking Neural Networks with Local Tandem Learning
Spiking neural networks (SNNs) are shown to be more biologically plausib...
