Automatic Neural Network Pruning that Efficiently Preserves the Model Accuracy

11/18/2021
by Thibault Castells, et al.

Neural network performance has improved significantly in the last few years, at the cost of an increasing number of floating point operations per second (FLOPs). However, more FLOPs can be an issue when computational resources are limited. Pruning filters is a common solution to this problem, but most existing pruning methods do not preserve the model accuracy efficiently and therefore require a large number of finetuning epochs. In this paper, we propose an automatic pruning method that learns which neurons to preserve in order to maintain the model accuracy while reducing the FLOPs to a predefined target. To accomplish this task, we introduce a trainable bottleneck that only requires one single epoch with 25.6% (ILSVRC2012) of the dataset to learn which filters to prune. Experiments on various architectures and datasets show that the proposed method can not only preserve the accuracy after pruning but also outperform existing methods after finetuning. We achieve a 52.00% FLOPs reduction with an accuracy of 47.51% after pruning and 76.63% after finetuning (code available for review).
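The core idea of a trainable bottleneck can be illustrated with a toy sketch: each filter gets a learnable sigmoid gate that scales its output, and training balances a proxy for accuracy against a quadratic penalty on the distance to a FLOPs budget. Everything below (the importance scores, costs, loss form, and hyperparameters) is illustrative and assumed, not the paper's actual implementation:

```python
import numpy as np

# Hypothetical sketch of a trainable bottleneck: filter i gets a learnable
# logit a_i whose gate g_i = sigmoid(a_i) scales that filter's output.
# Filters whose gates collapse toward 0 are the ones to prune.

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def train_bottleneck(importance, cost, flops_target, lam=5.0, lr=0.5, steps=500):
    """importance: toy proxy for each filter's contribution to accuracy.
    cost: FLOPs of each filter. Minimizes (by plain gradient descent)
    L = -sum(g * importance) + lam * (sum(g * cost) - flops_target)^2."""
    a = np.zeros_like(importance)            # all gates start at 0.5
    for _ in range(steps):
        g = sigmoid(a)
        excess = g @ cost - flops_target     # how far over the FLOPs budget
        # dL/da_i = (-importance_i + 2*lam*excess*cost_i) * g_i * (1 - g_i)
        grad = (-importance + 2.0 * lam * excess * cost) * g * (1.0 - g)
        a -= lr * grad
    return sigmoid(a)

importance = np.array([0.9, 0.1, 0.8, 0.05])   # made-up scores
cost = np.array([1.0, 1.0, 1.0, 1.0])          # equal FLOPs per filter
gates = train_bottleneck(importance, cost, flops_target=2.0)
keep = gates > 0.5                             # the two important filters survive
print(keep)
```

In this toy setup the FLOPs penalty pushes all gates down while the importance term pushes them up, so under a budget of two filters the gates separate: high-importance filters stay open, the rest close and would be pruned. The actual method learns such gates jointly with the network on real data rather than from fixed importance scores.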


Related research:

- 06/18/2019: A One-step Pruning-recovery Framework for Acceleration of Convolutional Neural Networks
  Acceleration of convolutional neural network has received increasing att...
- 10/17/2018: Pruning Deep Neural Networks using Partial Least Squares
  To handle the high computational cost in deep convolutional networks, re...
- 01/16/2017: The Incredible Shrinking Neural Network: New Perspectives on Learning Representations Through The Lens of Pruning
  How much can pruning algorithms teach us about the fundamentals of learn...
- 10/29/2022: A pruning method based on the dissimilarity of angle among channels and filters
  Convolutional Neural Network (CNN) is more and more widely used in vario...
- 09/15/2022: Neural Networks Reduction via Lumping
  The increasing size of recently proposed Neural Networks makes it hard t...
- 06/11/2019: Simultaneously Learning Architectures and Features of Deep Neural Networks
  This paper presents a novel method which simultaneously learns the numbe...
- 11/10/2018: Using NonBacktracking Expansion to Analyze k-core Pruning Process
  We induce the NonBacktracking Expansion Branch method to analyze the k-c...
