Filter Pruning via Filters Similarity in Consecutive Layers

by Xiaorui Wang et al.

Filter pruning is widely adopted to compress and accelerate convolutional neural networks (CNNs), but most previous works ignore the relationship between filters and channels in different layers. Processing each layer independently fails to exploit the collaborative relationships across layers. In this paper, we propose a novel pruning method that explicitly leverages the Filter Similarity in Consecutive Layers (FSCL). FSCL compresses models by pruning filters whose corresponding features contribute least to the model. Extensive experiments demonstrate the effectiveness of FSCL: it yields remarkable improvements over the state of the art in accuracy, FLOPs, and parameter reduction on several benchmark models and datasets.
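The abstract does not spell out the exact scoring rule, but the core idea of tying a filter to the next layer that consumes its output can be sketched as follows. This is a minimal, illustrative NumPy sketch under the assumption that a filter's importance combines its own magnitude with the magnitude of the corresponding input-channel slice in the next layer's weights; the function names and the product-of-norms criterion are hypothetical, not the paper's actual method.

```python
import numpy as np

def cross_layer_scores(w_l, w_next):
    """Illustrative cross-layer filter scoring (assumed criterion, not the
    paper's exact rule).

    w_l:    array of shape (C_out, C_in, k, k), filters of layer l
    w_next: array of shape (C_next, C_out, k, k), filters of layer l+1
    Returns one score per filter of layer l.
    """
    # Norm of each filter in layer l.
    own = np.linalg.norm(w_l.reshape(w_l.shape[0], -1), axis=1)
    # Input channel i of every layer-(l+1) filter consumes the feature map
    # produced by filter i of layer l, so gather those slices per channel.
    downstream = np.linalg.norm(
        w_next.transpose(1, 0, 2, 3).reshape(w_l.shape[0], -1), axis=1)
    # Couple the two: a filter matters only if it is large AND the next
    # layer actually weights its output.
    return own * downstream

def indices_to_prune(scores, ratio):
    """Select the fraction `ratio` of filters with the lowest joint scores."""
    k = int(len(scores) * ratio)
    return np.argsort(scores)[:k]
```

In this toy setup, pruning the returned indices removes both the filter in layer l and the matching input-channel slices in layer l+1, which is where the FLOP and parameter savings of structured pruning come from.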




