Fooling Polarization-based Vision using Locally Controllable Polarizing Projection

by Zhuoxiao Li, et al.

Polarization is a fundamental property of light that encodes abundant information about surface shape, material, illumination, and viewing geometry. The computer vision community has witnessed a blossoming of polarization-based vision applications, such as reflection removal, shape-from-polarization, transparent object segmentation, and color constancy, partly due to the emergence of single-chip mono/color polarization sensors that make polarization data acquisition easier than ever. However, is polarization-based vision vulnerable to adversarial attacks? If so, is it possible to realize these attacks in the physical world without being perceived by human eyes? In this paper, we warn the community that the vulnerability of polarization-based vision can be more serious than that of RGB-based vision. By adapting a commercial LCD projector, we achieve locally controllable polarizing projection, which we successfully use to fool state-of-the-art polarization-based vision algorithms for glass segmentation and color constancy. Existing physical attacks on RGB-based vision always suffer from a trade-off between attack efficacy and visual stealthiness; in contrast, adversarial attacks based on polarizing projection are contact-free and visually imperceptible, since the naked human eye can hardly distinguish maliciously manipulated polarized light from ordinary illumination. This poses unprecedented risks to polarization-based vision, in both the monochromatic and trichromatic domains, which warrant due attention and countermeasures.
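The single-chip polarization sensors mentioned above capture intensities behind on-pixel linear polarizers at four angles (0°, 45°, 90°, 135°). As background for why such sensors make polarization cues easy to obtain, here is a minimal sketch of the standard Stokes-parameter computation of the degree and angle of linear polarization (DoLP/AoLP); the function names are illustrative and not taken from the paper.

```python
import numpy as np

def stokes_from_polarized(i0, i45, i90, i135):
    """Standard linear Stokes parameters from four polarizer-angle
    intensities, as captured by a micro-polarizer-array sensor."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)  # total intensity
    s1 = i0 - i90                       # horizontal vs. vertical component
    s2 = i45 - i135                     # diagonal components
    return s0, s1, s2

def dolp_aolp(s0, s1, s2, eps=1e-8):
    """Degree of linear polarization in [0, 1] and angle of linear
    polarization in radians; eps guards against division by zero."""
    dolp = np.sqrt(s1**2 + s2**2) / (s0 + eps)
    aolp = 0.5 * np.arctan2(s2, s1)
    return dolp, aolp
```

For example, light fully polarized at 0° follows Malus's law, giving intensities (1, 0.5, 0, 0.5) at the four angles, which yields DoLP ≈ 1 and AoLP ≈ 0; a projection that manipulates these quantities locally, as the paper proposes, perturbs exactly the cues that glass-segmentation and color-constancy algorithms rely on.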




