Comment on Transferability and Input Transformation with Additive Noise

06/18/2022
by Hoki Kim, et al.

Adversarial attacks have demonstrated the vulnerability of neural networks: by adding small perturbations to a benign example, an attack can generate an adversarial example that leads a deep learning model to misclassify it. More importantly, an adversarial example generated from a specific model can often deceive other models without modification. We call this phenomenon “transferability”. Here, we analyze the relationship between transferability and input transformation with additive noise, mathematically proving that the modified optimization can produce more transferable adversarial examples.
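The abstract describes an attack optimization in which the input is transformed with additive noise before the gradient is computed. The exact formulation is not given here, so the following is only a minimal PGD-style sketch of that idea, assuming a PyTorch classifier; the function name, step size, noise scale, and number of noisy copies are illustrative assumptions, not the authors' settings.

```python
import torch

def noisy_grad_attack(model, loss_fn, x, y, eps=8/255, alpha=2/255,
                      steps=10, n_noise=5, sigma=0.05):
    """Sketch: iterative attack whose gradient is averaged over copies
    of the current input perturbed with additive Gaussian noise.
    All hyperparameter values here are assumptions for illustration."""
    x_adv = x.clone().detach()
    for _ in range(steps):
        grad = torch.zeros_like(x_adv)
        for _ in range(n_noise):
            # transform the input with additive noise before the forward pass
            noisy = (x_adv + sigma * torch.randn_like(x_adv)).detach()
            noisy.requires_grad_(True)
            loss = loss_fn(model(noisy), y)
            grad += torch.autograd.grad(loss, noisy)[0]
        grad /= n_noise
        # ascend the averaged gradient, then project back into the eps-ball
        x_adv = x_adv + alpha * grad.sign()
        x_adv = torch.max(torch.min(x_adv, x + eps), x - eps)
        x_adv = x_adv.clamp(0.0, 1.0).detach()
    return x_adv
```

Averaging gradients over noisy copies smooths the loss surface the attack follows; the usual intuition is that this keeps the perturbation from overfitting the source model's sharp local geometry, which is why such transformations tend to improve transfer to unseen models.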


