Physical Adversarial Attack on Vehicle Detector in the Carla Simulator

by Tong Wu et al.

In this paper, we tackle the problem of physical adversarial examples for object detectors in the wild. Specifically, we propose to generate adversarial patterns to be applied on the vehicle surface so that the vehicle is not recognizable by detectors in the photo-realistic Carla simulator. Our approach combines two main techniques, an Enlarge-and-Repeat process and a Discrete Searching method, to craft mosaic-like adversarial vehicle textures without access to either the model weights of the detector or a differentiable rendering procedure. The experimental results demonstrate the effectiveness of our approach in the simulator.
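The abstract does not detail the Discrete Searching method, but a black-box attack of this kind can be pictured as a greedy search over a discrete mosaic texture, keeping any single-cell change that lowers the detector's confidence. The sketch below is purely illustrative: the palette, grid size, and `detector_score` stand-in are assumptions, not the paper's actual detector query or Enlarge-and-Repeat pipeline.

```python
import random

PALETTE = list(range(8))   # 8 candidate mosaic colors (assumed)
GRID_W, GRID_H = 4, 4      # mosaic resolution (assumed)

def detector_score(texture):
    """Stand-in for querying the black-box detector: lower = less detectable.
    A real attack would render the textured vehicle in the simulator and
    read back the detector's confidence; this toy objective only serves
    to make the search loop runnable."""
    return sum(abs(c - 3) for c in texture)

def discrete_search(steps=300, seed=0):
    """Greedy discrete search: propose one-cell changes, keep improvements."""
    rng = random.Random(seed)
    texture = [rng.choice(PALETTE) for _ in range(GRID_W * GRID_H)]
    best = detector_score(texture)
    for _ in range(steps):
        i = rng.randrange(len(texture))    # pick one mosaic cell
        old = texture[i]
        texture[i] = rng.choice(PALETTE)   # propose a discrete change
        score = detector_score(texture)
        if score < best:
            best = score                   # keep changes that help
        else:
            texture[i] = old               # revert otherwise
    return texture, best

if __name__ == "__main__":
    tex, score = discrete_search()
    print(score)
```

Because each step only needs the detector's score for a rendered candidate, this kind of search requires neither gradients nor a differentiable renderer, which matches the black-box setting described above.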
