Movement Pruning: Adaptive Sparsity by Fine-Tuning

05/15/2020
by Victor Sanh, et al.

Magnitude pruning is a widely used strategy for reducing model size in pure supervised learning; however, it is less effective in the transfer learning regime that has become standard for state-of-the-art natural language processing applications. We propose the use of movement pruning, a simple, deterministic first-order weight pruning method that is more adaptive to pretrained model fine-tuning. We give mathematical foundations to the method and compare it to existing zeroth- and first-order pruning methods. Experiments show that when pruning large pretrained language models, movement pruning shows significant improvements in high-sparsity regimes. When combined with distillation, the approach achieves minimal accuracy loss with down to only 3% of the model parameters.
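
The abstract describes movement pruning only at a high level. As a rough illustration of the core mechanism (importance scores learned jointly with the weights through a straight-through top-k mask, so that weights moving away from zero during fine-tuning accumulate high scores), here is a minimal PyTorch sketch. It is not the authors' released implementation; the class names, the fixed keep_ratio knob, and the initialization choices are illustrative assumptions.

import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class TopKMask(torch.autograd.Function):
    """Binary top-k mask with a straight-through gradient.

    Forward: keep the entries with the highest scores. Backward: pass the
    incoming gradient straight through to the scores, so each score receives
    dL/d(masked weight) * weight, the movement-pruning signal."""

    @staticmethod
    def forward(ctx, scores, keep_ratio):
        k = max(1, int(keep_ratio * scores.numel()))
        threshold = torch.topk(scores.flatten(), k).values.min()
        return (scores >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        # Straight-through estimator: gradient flows to the scores unchanged;
        # keep_ratio is a plain float and gets no gradient.
        return grad_output, None


class MovementPrunedLinear(nn.Module):
    """Linear layer whose weights are masked by learned importance scores.

    Weights that move away from zero during fine-tuning accumulate high
    scores and are kept; the rest are masked out."""

    def __init__(self, in_features, out_features, keep_ratio=0.10):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))
        # Scores start at zero, so nothing is pruned before any training signal arrives.
        self.scores = nn.Parameter(torch.zeros(out_features, in_features))
        self.keep_ratio = keep_ratio
        nn.init.kaiming_uniform_(self.weight, a=math.sqrt(5))

    def forward(self, x):
        mask = TopKMask.apply(self.scores, self.keep_ratio)
        return F.linear(x, self.weight * mask, self.bias)

Under gradient descent the straight-through estimator makes each score's gradient equal to the corresponding weight gradient times the weight itself, so a score grows exactly when its weight is being pushed away from zero. The fixed keep_ratio above is a simplification: the paper ramps the target sparsity up over the course of fine-tuning with a schedule rather than holding it constant.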

Related research

10/12/2022 - Pruning Pre-trained Language Models Without Fine-Tuning
To overcome the overparameterized problem in Pre-trained Language Models...

02/28/2023 - Fast as CHITA: Neural Network Pruning with Combinatorial Optimization
The sheer size of modern neural networks makes model serving a serious c...

09/10/2021 - Block Pruning For Faster Transformers
Pre-training has improved model accuracy for both classification and gen...

04/25/2023 - Towards Compute-Optimal Transfer Learning
The field of transfer learning is undergoing a significant shift with th...

10/10/2019 - Structured Pruning of Large Language Models
Large language models have recently achieved state of the art performanc...

06/20/2023 - A Simple and Effective Pruning Approach for Large Language Models
As their size increases, Large Languages Models (LLMs) are natural candi...

12/14/2020 - Parameter-Efficient Transfer Learning with Diff Pruning
While task-specific finetuning of pretrained networks has led to signifi...
