Implicit Filter Sparsification In Convolutional Neural Networks

05/13/2019
by Dushyant Mehta, et al.

We show that implicit filter-level sparsity manifests in convolutional neural networks (CNNs) which employ Batch Normalization and ReLU activation, and are trained with adaptive gradient descent techniques and L2 regularization or weight decay. Through an extensive empirical study (Mehta et al., 2019) we hypothesize the mechanism behind the sparsification process, and find surprising links to certain filter sparsification heuristics proposed in the literature. The emergence, and subsequent pruning, of selective features is observed to be one of the contributing mechanisms, leading to feature sparsity on par with or better than certain explicit sparsification/pruning approaches. In this workshop article we summarize our findings, and point out corollaries of selective-feature penalization which could also be employed as heuristics for filter pruning.
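As a rough illustration of how such implicit filter sparsity could be surfaced in practice, the sketch below counts convolution filters whose following BatchNorm scale (gamma) has collapsed towards zero, the kind of signal that BN-based filter pruning heuristics rely on. This is a minimal PyTorch sketch under assumptions of ours, not the paper's procedure: the helper name count_implicitly_sparse_filters, the 1e-2 threshold, and the toy BN+ReLU model are illustrative choices.

```python
import torch
import torch.nn as nn

def count_implicitly_sparse_filters(model: nn.Module, threshold: float = 1e-2):
    """Count convolution filters that appear implicitly 'switched off'.

    A filter is treated as inactive when the learned scale (gamma) of the
    BatchNorm layer that follows it has near-zero magnitude, so the filter
    contributes (almost) nothing to the downstream activations. The 1e-2
    threshold is an illustrative choice, not a value from the paper.
    """
    total, inactive = 0, 0
    for module in model.modules():
        if isinstance(module, nn.BatchNorm2d):
            gamma = module.weight.detach().abs()
            total += gamma.numel()
            inactive += int((gamma < threshold).sum())
    return inactive, total

# Toy BN+ReLU CNN trained with Adam and L2 regularization (weight_decay),
# i.e. the setting in which the article reports implicit filter sparsity.
model = nn.Sequential(
    nn.Conv2d(3, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
    nn.Conv2d(64, 128, 3, padding=1), nn.BatchNorm2d(128), nn.ReLU(),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
# ... train as usual, then inspect the emergent sparsity ...
inactive, total = count_implicitly_sparse_filters(model)
print(f"{inactive}/{total} filters implicitly sparsified")
```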


Related research

11/29/2018 - On Implicit Filter Level Sparsity in Convolutional Neural Networks
We investigate filter level sparsity that emerges in convolutional neura...

12/02/2021 - Batch Normalization Tells You Which Filter is Important
The goal of filter pruning is to search for unimportant filters to remov...

02/18/2018 - Efficient Sparse-Winograd Convolutional Neural Networks
Convolutional Neural Networks (CNNs) are computationally intensive, whic...

01/08/2019 - Spatial-Winograd Pruning Enabling Sparse Winograd Convolution
Deep convolutional neural networks (CNNs) are deployed in various applic...

03/15/2022 - Interspace Pruning: Using Adaptive Filter Representations to Improve Training of Sparse CNNs
Unstructured pruning is well suited to reduce the memory footprint of co...

08/13/2022 - Entropy Induced Pruning Framework for Convolutional Neural Networks
Structured pruning techniques have achieved great compression performanc...
