RobustMQ: Benchmarking Robustness of Quantized Models

08/04/2023
by Yisong Xiao, et al.

Quantization has emerged as an essential technique for deploying deep neural networks (DNNs) on devices with limited resources. However, quantized models exhibit vulnerabilities when exposed to various noises in real-world applications. Despite the importance of evaluating the impact of quantization on robustness, existing research on this topic is limited and often disregards established principles of robustness evaluation, resulting in incomplete and inconclusive findings. To address this gap, we thoroughly evaluate the robustness of quantized models against various noises (adversarial attacks, natural corruptions, and systematic noises) on ImageNet. Our comprehensive evaluation empirically provides valuable insights into the robustness of quantized models in various scenarios, for example: (1) quantized models exhibit higher adversarial robustness than their floating-point counterparts, but are more vulnerable to natural corruptions and systematic noises; (2) in general, increasing the quantization bit-width decreases adversarial robustness while increasing both natural and systematic robustness; (3) among corruption methods, impulse noise and glass blur are the most harmful to quantized models, while brightness has the least impact; (4) among systematic noises, nearest-neighbor interpolation has the highest impact, while bilinear, cubic, and area interpolation are the three least harmful. Our research contributes to advancing the robust quantization of models and their deployment in real-world scenarios.
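To make the bit-width trade-off concrete, the sketch below implements plain uniform symmetric quantization of a weight vector. This is an illustration only, not the benchmarking pipeline from the paper: real quantized models use per-channel scales, calibration data, and integer kernels, and the values here are hypothetical. It does show the basic effect the abstract refers to, namely that a higher bit-width leaves a smaller rounding error between the original and the dequantized weights.

```python
def quantize(weights, bits):
    """Uniform symmetric quantization of a list of floats to `bits` bits.

    Maps each weight to an integer in [-(2^(bits-1)), 2^(bits-1) - 1]
    using a single scale derived from the largest magnitude.
    """
    qmax = 2 ** (bits - 1) - 1                      # e.g. 127 for 8-bit
    scale = max(abs(w) for w in weights) / qmax or 1.0
    q = [max(-qmax - 1, min(qmax, round(w / scale))) for w in weights]
    return q, scale


def dequantize(q, scale):
    """Map quantized integers back to floats."""
    return [v * scale for v in q]


# Hypothetical weights, purely for illustration.
weights = [0.82, -0.31, 0.05, -0.77]
q4, s4 = quantize(weights, 4)   # coarse grid: larger rounding error
q8, s8 = quantize(weights, 8)   # finer grid: error shrinks with bit-width

err4 = max(abs(w - v) for w, v in zip(weights, dequantize(q4, s4)))
err8 = max(abs(w - v) for w, v in zip(weights, dequantize(q8, s8)))
```

Here `err8` comes out smaller than `err4`, since the maximum rounding error of this scheme is bounded by half the quantization step, which halves with each extra bit.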


