Stealthy Low-frequency Backdoor Attack against Deep Neural Networks

05/10/2023
by   Xinrui Liu, et al.

Deep neural networks (DNNs) have gained popularity in various scenarios in recent years. However, their excellent ability to fit complex functions also makes them vulnerable to backdoor attacks. Specifically, a backdoor can remain hidden indefinitely until it is activated by a sample carrying a specific trigger, which makes it highly concealed. Nevertheless, existing backdoor attacks operate in the spatial domain, i.e., the poisoned images are generated by adding perturbations to the original images, which makes them easy to detect. To bring the potential of backdoor attacks into full play, we propose the low-pass attack, a novel attack scheme that uses a low-pass filter to inject the backdoor in the frequency domain. Unlike traditional poisoned-image generation methods, our approach reduces high-frequency components and preserves the original images' semantic information instead of adding extra perturbations, improving its ability to evade current defenses. In addition, we introduce a "precision mode" that makes the backdoor trigger only at a specified level of filtering, which further improves stealthiness. We evaluate our low-pass attack on four datasets and demonstrate that even at a poisoning rate of 0.01 we can perform a stealthy attack without sacrificing attack performance. Moreover, our backdoor attack successfully bypasses state-of-the-art defense mechanisms. We also compare our attack with existing backdoor attacks and show that our poisoned images are nearly invisible and retain higher image quality.
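The core idea of the trigger, suppressing high-frequency components instead of adding a perturbation, can be illustrated with an ideal circular low-pass filter applied in the 2D FFT domain. The sketch below is only a minimal, assumed illustration of that idea, not the paper's implementation; the function name low_pass_poison, the cutoff parameterization, and the filter shape are hypothetical.

```python
import numpy as np

def low_pass_poison(image: np.ndarray, cutoff: float) -> np.ndarray:
    """Illustrative low-pass trigger: keep only low-frequency content.

    image:  H x W (grayscale) or H x W x C array with values in [0, 1].
    cutoff: fraction (0 < cutoff <= 1) of the half-diagonal kept as the
            low-frequency band; smaller values mean stronger filtering.
    """
    h, w = image.shape[:2]
    yy, xx = np.ogrid[:h, :w]
    # Distance of every frequency bin from the centre of the shifted spectrum.
    dist = np.sqrt((yy - h / 2) ** 2 + (xx - w / 2) ** 2)
    radius = cutoff * np.sqrt((h / 2) ** 2 + (w / 2) ** 2)
    mask = (dist <= radius).astype(float)

    def filter_channel(ch: np.ndarray) -> np.ndarray:
        spectrum = np.fft.fftshift(np.fft.fft2(ch))
        filtered = spectrum * mask  # zero out high-frequency components
        return np.real(np.fft.ifft2(np.fft.ifftshift(filtered)))

    if image.ndim == 2:
        out = filter_channel(image)
    else:
        out = np.stack(
            [filter_channel(image[..., c]) for c in range(image.shape[2])],
            axis=-1,
        )
    return np.clip(out, 0.0, 1.0)
```

A poisoned training sample would then simply be low_pass_poison(x, cutoff) paired with the attacker's target label; because nothing is added to the image, the poisoned sample keeps the original semantics and differs from the clean one only in its high-frequency content.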


