Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks

08/21/2018
by Yang He, et al.

This paper proposes a Soft Filter Pruning (SFP) method to accelerate the inference procedure of deep Convolutional Neural Networks (CNNs). Specifically, SFP enables the pruned filters to be updated when training the model after pruning. SFP has two advantages over previous works: (1) Larger model capacity. Updating previously pruned filters gives our approach a larger optimization space than fixing the filters to zero. Therefore, the network trained by our method has a larger model capacity to learn from the training data. (2) Less dependence on the pre-trained model. The larger capacity enables SFP to train from scratch and prune the model simultaneously. In contrast, previous filter pruning methods must be conducted on the basis of a pre-trained model to guarantee their performance. Empirically, SFP from scratch outperforms the previous filter pruning methods. Moreover, our approach has been demonstrated effective for many advanced CNN architectures. Notably, on ILSVRC-2012, SFP reduces more than 42% FLOPs on ResNet-101 with even 0.2% top-5 accuracy improvement, which has advanced the state-of-the-art. Code is publicly available on GitHub: https://github.com/he-y/soft-filter-pruning
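The core mechanism described above, zeroing low-norm filters after each training epoch while leaving them trainable so they can recover later, can be sketched in a few lines of PyTorch. This is a minimal illustration of the idea, not the authors' implementation; the helper names (`soft_prune`, `train_one_epoch`) and the single global `prune_rate` are assumptions for this sketch, and the official repository linked above contains the full method.

```python
# Minimal sketch of soft filter pruning: zero the lowest-L2-norm filters
# of each conv layer, but keep them in the model so gradient updates can
# revive them in later epochs (unlike hard pruning, which removes them).
import torch
import torch.nn as nn

def soft_prune(model: nn.Module, prune_rate: float = 0.3) -> None:
    """Zero the `prune_rate` fraction of filters with smallest L2 norm."""
    for module in model.modules():
        if isinstance(module, nn.Conv2d):
            weight = module.weight.data                    # (out_ch, in_ch, kh, kw)
            norms = weight.view(weight.size(0), -1).norm(p=2, dim=1)
            n_prune = int(weight.size(0) * prune_rate)
            if n_prune == 0:
                continue
            _, idx = torch.topk(norms, n_prune, largest=False)
            weight[idx] = 0.0                              # soft-prune: zero, do not remove

# Hypothetical training loop: prune after every epoch, then keep training,
# so previously pruned filters continue to receive updates.
# for epoch in range(num_epochs):
#     train_one_epoch(model, loader, optimizer)
#     soft_prune(model, prune_rate=0.3)
```

Because the zeroed filters remain part of the parameter tensor, the same loop supports both pruning a pre-trained model and training-plus-pruning from scratch, which is the "less dependence on the pre-trained model" property the abstract highlights.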


Related research

08/22/2018
Progressive Deep Neural Networks Acceleration via Soft Filter Pruning
This paper proposed a Progressive Soft Filter Pruning method (PSFP) to p...

03/15/2022
Interspace Pruning: Using Adaptive Filter Representations to Improve Training of Sparse CNNs
Unstructured pruning is well suited to reduce the memory footprint of co...

07/16/2020
Training Interpretable Convolutional Neural Networks by Differentiating Class-specific Filters
Convolutional neural networks (CNNs) have been successfully used in a ra...

07/01/2023
Filter Pruning for Efficient CNNs via Knowledge-driven Differential Filter Sampler
Filter pruning simultaneously accelerates the computation and reduces th...

10/09/2021
Weight Evolution: Improving Deep Neural Networks Training through Evolving Inferior Weight Values
To obtain good performance, convolutional neural networks are usually ov...

03/02/2023
Practical Network Acceleration with Tiny Sets: Hypothesis, Theory, and Algorithm
Due to data privacy issues, accelerating networks with tiny training set...

11/06/2020
GHFP: Gradually Hard Filter Pruning
Filter pruning is widely used to reduce the computation of deep learning...
