Imperceptible Physical Attack against Face Recognition Systems via LED Illumination Modulation

07/25/2023
by   Junbin Fang, et al.

Although face recognition is beginning to play an important role in our daily life, data-driven face recognition vision systems are vulnerable to adversarial attacks. However, both of the current categories of adversarial attacks, digital attacks and physical attacks, have drawbacks: the former are impractical in real scenarios, while the latter are conspicuous, computationally expensive, and hard to execute. To address these issues, we propose a practical, executable, inconspicuous, and computationally lightweight adversarial attack based on LED illumination modulation. To fool face recognition systems, the proposed attack generates luminance changes that are imperceptible to human eyes through fast intensity modulation of the scene's LED illumination, and exploits the rolling shutter effect of CMOS image sensors to implant luminance perturbations into the captured face images. In summary, we present a denial-of-service (DoS) attack against face detection and a dodging attack against face verification. We also evaluate their effectiveness against well-known face detection models (Dlib, MTCNN, and RetinaFace) and face verification models (Dlib, FaceNet, and ArcFace). Extensive experiments show that the success rates of the DoS attacks against the face detection models reach 97.67%, and the success rates of the dodging attacks against all face verification models reach 100%.
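To make the mechanism concrete, the sketch below simulates how a rolling-shutter CMOS sensor turns a fast square-wave modulation of the scene LED into horizontal luminance stripes in the captured frame. This is not the authors' implementation; the function name and all parameter values (modulation frequency, duty cycle, per-row readout time, dim factor) are illustrative assumptions.

```python
# Minimal sketch (assumed parameters, not the paper's code) of the rolling
# shutter effect: rows of a CMOS sensor are exposed at slightly different
# times, so an LED flickering faster than human flicker fusion leaves a
# row-wise bright/dim stripe pattern on the image.
import numpy as np

def simulate_rolling_shutter(image, mod_freq_hz=3200.0, duty_cycle=0.5,
                             line_readout_s=18e-6, dim_factor=0.6, phase=0.0):
    """Apply a per-row brightness gain mimicking rolling-shutter capture
    under square-wave modulated LED illumination.

    image          : HxW or HxWx3 array with values in [0, 1]
    mod_freq_hz    : assumed LED intensity-modulation frequency
    duty_cycle     : fraction of each period the LED is at full brightness
    line_readout_s : assumed time offset between exposures of adjacent rows
    dim_factor     : relative brightness of rows exposed in the "off" phase
    """
    img = np.asarray(image, dtype=np.float32)
    rows = img.shape[0]
    # Approximate exposure time of each sensor row.
    t = np.arange(rows) * line_readout_s + phase
    # Square-wave illumination: full brightness during the "on" phase.
    on = (t * mod_freq_hz) % 1.0 < duty_cycle
    gain = np.where(on, 1.0, dim_factor).astype(np.float32)
    # Broadcast the per-row gain across columns (and channels, if any).
    gain = gain.reshape(rows, *([1] * (img.ndim - 1)))
    return np.clip(img * gain, 0.0, 1.0)

if __name__ == "__main__":
    # Example: stripe a synthetic gray frame; a real attack would feed such
    # perturbed frames to a face detection or verification model.
    frame = np.full((480, 640, 3), 0.5, dtype=np.float32)
    perturbed = simulate_rolling_shutter(frame)
    print(perturbed[:8, 0, 0])  # alternating bright/dim rows
```

With the assumed values (3.2 kHz modulation, 18 µs row readout), one flicker period spans roughly 17 sensor rows, so the resulting stripes are fine enough to perturb a detector's input while the illumination itself appears steady to the human eye.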


Related research

06/09/2022 · ReFace: Real-time Adversarial Attacks on Face Recognition Systems
Deep neural network based face recognition models have been shown to be ...

04/09/2019 · Efficient Decision-based Black-box Adversarial Attacks on Face Recognition
Face recognition has obtained remarkable progress in recent years due to...

09/12/2023 · Generalized Attacks on Face Verification Systems
Face verification (FV) using deep neural network models has made tremend...

09/01/2023 · Impact of Image Context for Single Deep Learning Face Morphing Attack Detection
The increase in security concerns due to technological advancements has ...

09/24/2018 · Fast Geometrically-Perturbed Adversarial Faces
The state-of-the-art performance of deep learning algorithms has led to ...

08/18/2021 · Adversarial Relighting against Face Recognition
Deep face recognition (FR) has achieved significantly high accuracy on s...

09/29/2022 · Digital and Physical Face Attacks: Reviewing and One Step Further
With the rapid progress over the past five years, face authentication ha...
