Lookahead: A Far-Sighted Alternative of Magnitude-based Pruning

02/12/2020
by Sejun Park, et al.

Magnitude-based pruning is one of the simplest methods for pruning neural networks. Despite its simplicity, magnitude-based pruning and its variants have demonstrated remarkable performance when pruning modern architectures. Based on the observation that magnitude-based pruning in fact minimizes the Frobenius distortion of the linear operator corresponding to a single layer, we develop a simple pruning method, coined lookahead pruning, by extending this single-layer optimization to a multi-layer optimization. Our experimental results demonstrate that the proposed method consistently outperforms magnitude-based pruning on various networks, including VGG and ResNet, particularly in the high-sparsity regime. Code is available at https://github.com/alinlab/lookahead_pruning.
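
For illustration, the sketch below shows how a lookahead-style score might be computed for a stack of fully connected layers in NumPy, assuming the score of a weight combines its own magnitude with the norms of the connected row of the previous layer's matrix and column of the next layer's matrix. This is a hypothetical sketch under that assumption, not the implementation from the linked repository; all function names here are made up.

import numpy as np

def magnitude_scores(W):
    """Plain magnitude-based pruning score: |w| for each weight."""
    return np.abs(W)

def lookahead_scores(W_prev, W, W_next):
    """Lookahead-style score for the middle layer of a three-layer block.

    The weight W[j, i] is scaled by how strongly input unit i is driven by
    the previous layer (norm of row i of W_prev) and how strongly output
    unit j is read by the next layer (norm of column j of W_next).
    Pass None for W_prev / W_next at the network boundaries.
    """
    in_scale = np.ones(W.shape[1]) if W_prev is None else np.linalg.norm(W_prev, axis=1)
    out_scale = np.ones(W.shape[0]) if W_next is None else np.linalg.norm(W_next, axis=0)
    return np.abs(W) * np.outer(out_scale, in_scale)

def prune_by_score(W, scores, sparsity):
    """Zero out the fraction `sparsity` of weights with the smallest scores."""
    k = int(sparsity * W.size)
    if k == 0:
        return W.copy()
    threshold = np.partition(scores.ravel(), k - 1)[k - 1]
    return np.where(scores <= threshold, 0.0, W)

# Usage: prune the middle layer of a small 3-layer MLP at 90% sparsity.
rng = np.random.default_rng(0)
W0, W1, W2 = rng.normal(size=(64, 32)), rng.normal(size=(64, 64)), rng.normal(size=(10, 64))
W1_sparse = prune_by_score(W1, lookahead_scores(W0, W1, W2), sparsity=0.9)
print("nonzero fraction:", np.count_nonzero(W1_sparse) / W1_sparse.size)

Compared with magnitude_scores(W1), the lookahead score keeps weights whose removal would distort the composition of the surrounding layers the most, which is the multi-layer view described in the abstract.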


Related research

10/15/2020 · A Deeper Look at the Layerwise Sparsity of Magnitude-based Pruning
09/29/2022 · Is Complexity Required for Neural Network Pruning? A Case Study on Global Magnitude Pruning
08/28/2021 · Layer-wise Model Pruning based on Mutual Information
12/10/2019 · Magnitude and Uncertainty Pruning Criterion for Neural Networks
06/20/2023 · A Simple and Effective Pruning Approach for Large Language Models
06/14/2022 · Zeroth-Order Topological Insights into Iterative Magnitude Pruning
10/28/2021 · An Operator Theoretic Perspective on Pruning Deep Neural Networks
