NeuroAttack: Undermining Spiking Neural Networks Security through Externally Triggered Bit-Flips

by Valerio Venceslai, et al.
Université Polytechnique Hauts-de-France
Politecnico di Torino

Due to their proven efficiency, machine-learning systems are deployed in a wide range of complex real-life problems. More specifically, Spiking Neural Networks (SNNs) have emerged as a promising solution to the accuracy, resource-utilization, and energy-efficiency challenges of machine-learning systems. While these systems are going mainstream, they have inherent security and reliability issues. In this paper, we propose NeuroAttack, a cross-layer attack that threatens the integrity of SNNs by exploiting low-level reliability issues through a high-level attack. In particular, we trigger a fault-injection-based, stealthy hardware backdoor through carefully crafted adversarial input noise. Our results on Deep Neural Networks (DNNs) and SNNs reveal a serious integrity threat to state-of-the-art machine-learning techniques.
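To give a concrete intuition for the low-level half of such an attack: flipping a single bit of a stored, quantized network parameter can change its value drastically. The sketch below is a toy illustration (not the paper's implementation), showing the effect of flipping the sign bit of an 8-bit two's-complement weight.

```python
def flip_bit(weight: int, bit: int) -> int:
    """Flip one bit of an 8-bit two's-complement weight.

    `weight` is a signed value in [-128, 127]; `bit` is the bit
    position (0 = LSB, 7 = sign bit).
    """
    u = weight & 0xFF                   # reinterpret as an unsigned byte
    u ^= 1 << bit                       # flip the chosen bit
    return u - 256 if u >= 128 else u   # back to signed two's complement

# A single fault in the most significant bit inverts the weight's sign
# and shifts its magnitude, which can be enough to redirect a neuron's
# contribution to the classification output.
w = 93
w_faulty = flip_bit(w, 7)
print(w, "->", w_faulty)  # 93 -> -35
```

Flipping the same bit again restores the original value, which is what makes such faults usable as a dormant backdoor: the network behaves normally until the fault is present and the matching adversarial input arrives.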

