A Gradient Flow Framework For Analyzing Network Pruning

09/24/2020
by Ekdeep Singh Lubana et al.

Recent network pruning methods focus on pruning models early-on in training. To estimate the impact of removing a parameter, these methods use importance measures that were originally designed to prune trained models. Despite lacking justification for their use early-on in training, such measures result in surprisingly low accuracy loss. To better explain this behavior, we develop a general gradient flow based framework that unifies state-of-the-art importance measures through the norm of model parameters. We use this framework to determine the relationship between pruning measures and evolution of model parameters, establishing several results related to pruning models early-on in training: (i) magnitude-based pruning removes parameters that contribute least to reduction in loss, resulting in models that converge faster than magnitude-agnostic methods; (ii) loss-preservation based pruning preserves first-order model evolution dynamics and is therefore appropriate for pruning minimally trained models; and (iii) gradient-norm based pruning affects second-order model evolution dynamics, such that increasing gradient norm via pruning can produce poorly performing models. We validate our claims on several VGG-13, MobileNet-V1, and ResNet-56 models trained on CIFAR-10 and CIFAR-100. Code available at https://github.com/EkdeepSLubana/flowandprune.
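To make the three families of importance measures concrete, below is a minimal PyTorch sketch (not the paper's released code; see the linked repository for the actual implementation) of how magnitude-based, loss-preservation (SNIP-style), and gradient-norm-based (GraSP-style) scores are commonly computed for each parameter. The toy model, batch, and exact score formulas are illustrative assumptions; the paper's framework relates these scores to the evolution of the parameter norm under gradient flow.

```python
import torch
import torch.nn as nn

# Toy model and batch; any differentiable model and loss would do.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
criterion = nn.CrossEntropyLoss()
x, y = torch.randn(64, 10), torch.randint(0, 2, (64,))

named_params = [(n, p) for n, p in model.named_parameters() if p.requires_grad]
params = [p for _, p in named_params]

loss = criterion(model(x), y)
# create_graph=True lets us differentiate through the gradients below
# (needed for the Hessian-vector product).
grads = torch.autograd.grad(loss, params, create_graph=True)

# Hessian-vector product H @ g, obtained by differentiating 0.5 * ||g||^2
# with respect to the parameters.
grad_norm_sq = sum((g * g).sum() for g in grads)
hvps = torch.autograd.grad(0.5 * grad_norm_sq, params)

scores = {}
for (name, p), g, hg in zip(named_params, grads, hvps):
    scores[name] = {
        # (i) Magnitude-based: small-norm parameters contribute least to loss reduction.
        "magnitude": p.detach().pow(2),
        # (ii) Loss-preservation (SNIP-style): |theta * grad|, the first-order change
        #      in loss incurred by removing the parameter.
        "loss_preservation": (p * g).detach().abs(),
        # (iii) Gradient-norm-based (GraSP-style): theta * (H @ grad), the first-order
        #       change in the gradient norm incurred by removing the parameter.
        "grad_norm": (p * hg).detach(),
    }
```

Parameters (or structured groups of them) would then be ranked by the chosen score and the lowest-ranked fraction pruned; the abstract's results concern how each choice of score interacts with training dynamics when pruning is applied early on.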


Related research

OrthoReg: Robust Network Pruning Using Orthonormality Regularization (09/10/2020)
Network pruning in Convolutional Neural Networks (CNNs) has been extensi...

Importance Estimation for Neural Network Pruning (06/25/2019)
Structural pruning of neural network parameters reduces computation, ene...

SBPF: Sensitiveness Based Pruning Framework For Convolutional Neural Network On Image Classification (08/09/2022)
Pruning techniques are used comprehensively to compress convolutional ne...

Filter Sketch for Network Pruning (01/23/2020)
In this paper, we propose a novel network pruning approach by informatio...

An Operator Theoretic Perspective on Pruning Deep Neural Networks (10/28/2021)
The discovery of sparse subnetworks that are able to perform as well as ...

Revisiting Loss Modelling for Unstructured Pruning (06/22/2020)
By removing parameters from deep neural networks, unstructured pruning m...

A Unified Framework for Soft Threshold Pruning (02/25/2023)
Soft threshold pruning is among the cutting-edge pruning methods with st...
