Second Rethinking of Network Pruning in the Adversarial Setting

03/29/2019
by Shaokai Ye, et al.

It is well known that deep neural networks (DNNs) are vulnerable to adversarial attacks, which are crafted by adding small perturbations to benign examples. Adversarial training based on min-max robust optimization can provide a notion of security against such attacks. However, adversarial robustness requires significantly more network capacity than natural training on benign examples alone, which conflicts with the need for compact models. This paper proposes a framework of concurrent adversarial training and weight pruning that enables model compression while preserving adversarial robustness, essentially resolving this dilemma of adversarial training. Furthermore, this work revisits two hypotheses about weight pruning from the conventional (non-adversarial) pruning setting and finds that weight pruning is essential for reducing model size in the adversarial setting: training a small model from scratch, even with initialization inherited from the large model, cannot achieve both adversarial robustness and model compression.
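For context, min-max adversarial training in its standard formulation solves

    \min_{\theta} \; \mathbb{E}_{(x,y)\sim\mathcal{D}} \Big[ \max_{\|\delta\|_{\infty} \le \epsilon} \mathcal{L}\big(f_{\theta}(x+\delta), y\big) \Big]

where the inner maximization searches for a worst-case perturbation \delta within an \epsilon-ball around each input and the outer minimization fits the weights \theta against those perturbations.

The sketch below illustrates what concurrent adversarial training and weight pruning could look like in PyTorch. It is a minimal illustration under simplifying assumptions, not the paper's exact procedure: PGD is used for the inner maximization, pruning is done with simple per-layer magnitude masks re-applied after every optimizer step, and the hyperparameters (eps, alpha, steps, sparsity) are placeholders.

```python
# Illustrative sketch only: PGD adversarial training combined with
# magnitude-based pruning masks. Hyperparameters and the pruning
# schedule are assumptions, not the paper's exact framework.
import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8/255, alpha=2/255, steps=10):
    """Inner maximization: projected gradient ascent on the loss in an L-inf ball."""
    delta = torch.empty_like(x).uniform_(-eps, eps).requires_grad_(True)
    for _ in range(steps):
        loss = F.cross_entropy(model(x + delta), y)
        grad, = torch.autograd.grad(loss, delta)
        delta = (delta + alpha * grad.sign()).clamp(-eps, eps)
        delta = delta.detach().requires_grad_(True)
    return (x + delta).clamp(0, 1).detach()

def magnitude_masks(model, sparsity=0.9):
    """Build binary masks that zero out the smallest-magnitude weights per layer."""
    masks = {}
    for name, p in model.named_parameters():
        if p.dim() > 1:  # prune conv/linear weights; skip biases and norm layers
            k = max(1, int(p.numel() * sparsity))
            threshold = p.detach().abs().flatten().kthvalue(k).values
            masks[name] = (p.detach().abs() > threshold).float()
    return masks

def train_epoch(model, loader, optimizer, masks, device="cuda"):
    """Outer minimization on adversarial examples; pruned weights stay zero."""
    model.train()
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        x_adv = pgd_attack(model, x, y)
        optimizer.zero_grad()
        F.cross_entropy(model(x_adv), y).backward()
        optimizer.step()
        with torch.no_grad():  # re-apply masks so pruned weights remain zero
            for name, p in model.named_parameters():
                if name in masks:
                    p.mul_(masks[name])
```

In use, one would adversarially pre-train the dense model, compute masks = magnitude_masks(model), apply them once, and then continue adversarial training with the masks held fixed; the paper's framework integrates the pruning constraint into the optimization itself rather than this one-shot masking.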

Related research

- 02/14/2022, Finding Dynamics Preserving Adversarial Winning Tickets: Modern deep neural networks (DNNs) are vulnerable to adversarial attacks...
- 08/16/2023, Benchmarking Adversarial Robustness of Compressed Deep Learning Models: The increasing size of Deep Neural Networks (DNNs) poses a pressing need...
- 06/13/2022, Distributed Adversarial Training to Robustify Deep Neural Networks at Scale: Current deep neural networks (DNNs) are vulnerable to adversarial attack...
- 02/03/2022, Robust Binary Models by Pruning Randomly-initialized Networks: We propose ways to obtain robust models against adversarial attacks from...
- 08/10/2021, On the Effect of Pruning on Adversarial Robustness: Pruning is a well-known mechanism for reducing the computational cost of...
- 09/11/2020, Achieving Adversarial Robustness via Sparsity: Network pruning has been known to produce compact models without much ac...
- 11/03/2020, A Tunable Robust Pruning Framework Through Dynamic Network Rewiring of DNNs: This paper presents a dynamic network rewiring (DNR) method to generate ...
