Beyond Preserved Accuracy: Evaluating Loyalty and Robustness of BERT Compression

09/07/2021
by Canwen Xu, et al.

Recent studies on compression of pretrained language models (e.g., BERT) usually use preserved accuracy as the metric for evaluation. In this paper, we propose two new metrics, label loyalty and probability loyalty, that measure how closely a compressed model (i.e., student) mimics the original model (i.e., teacher). We also explore the effect of compression on robustness under adversarial attacks. We benchmark quantization, pruning, knowledge distillation, and progressive module replacing on loyalty and robustness. By combining multiple compression techniques, we provide a practical strategy to achieve better accuracy, loyalty, and robustness.
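To make the two metrics concrete, here is a minimal sketch (not the authors' reference code): label loyalty is taken as the fraction of examples where the student's argmax prediction matches the teacher's, and probability loyalty is sketched under the assumption of a Jensen-Shannon-based definition, 1 - sqrt(D_JS(P||Q)) averaged over examples. The function names and the toy logits below are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon


def softmax(logits):
    # Numerically stable softmax over the last axis.
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)


def label_loyalty(teacher_probs, student_probs):
    # Fraction of examples where the student's predicted label
    # matches the teacher's predicted label (not the gold label).
    t = np.argmax(teacher_probs, axis=-1)
    s = np.argmax(student_probs, axis=-1)
    return float(np.mean(t == s))


def probability_loyalty(teacher_probs, student_probs):
    # Assumed definition: 1 - sqrt(JS divergence), averaged over examples.
    # scipy's jensenshannon already returns the JS *distance*, i.e. sqrt(JSD);
    # base=2 keeps each distance in [0, 1], so the score is also in [0, 1].
    dists = [jensenshannon(p, q, base=2)
             for p, q in zip(teacher_probs, student_probs)]
    return float(1.0 - np.mean(dists))


# Hypothetical teacher/student logits for 3 examples, 2 classes.
teacher = softmax(np.array([[2.0, 0.5], [0.1, 1.2], [3.0, 2.9]]))
student = softmax(np.array([[1.5, 0.7], [0.3, 0.9], [2.8, 3.1]]))
print(label_loyalty(teacher, student), probability_loyalty(teacher, student))
```

Note that both scores compare the student against the teacher's outputs rather than against ground-truth labels, which is what distinguishes loyalty from preserved accuracy.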


Related research

02/07/2020 · BERT-of-Theseus: Compressing BERT by Progressive Module Replacing
In this paper, we propose a novel model compression approach to effectiv...

10/16/2021 · What do Compressed Large Language Models Forget? Robustness Challenges in Model Compression
Recent works have focused on compressing pre-trained language models (PL...

03/21/2021 · ROSITA: Refined BERT cOmpreSsion with InTegrAted techniques
Pre-trained language models of the BERT family have defined the state-of...

08/16/2023 · Benchmarking Adversarial Robustness of Compressed Deep Learning Models
The increasing size of Deep Neural Networks (DNNs) poses a pressing need...

06/04/2021 · ERNIE-Tiny: A Progressive Distillation Framework for Pretrained Transformer Compression
Pretrained language models (PLMs) such as BERT adopt a training paradigm...

03/30/2023 · oBERTa: Improving Sparse Transfer Learning via improved initialization, distillation, and pruning regimes
In this paper, we introduce the range of oBERTa language models, an easy...

05/24/2023 · PruMUX: Augmenting Data Multiplexing with Model Compression
As language models increase in size by the day, methods for efficient in...
