HRank: Filter Pruning using High-Rank Feature Map

02/24/2020
by Mingbao Lin, et al.

Neural network pruning offers a promising prospect for deploying deep neural networks on resource-limited devices. However, existing methods are still challenged by training inefficiency and the labor cost of pruning designs, due to the lack of theoretical guidance on which network components are non-salient. In this paper, we propose a novel filter pruning method that explores the High Rank of feature maps (HRank). HRank is inspired by the discovery that the average rank of the feature maps generated by a single filter is always the same, regardless of the number of image batches the CNN receives. Based on HRank, we develop a mathematically formulated method to prune filters whose feature maps have low rank. The principle behind our pruning is that low-rank feature maps contain less information, so the pruned results can be easily reproduced. We also show experimentally that weights producing high-rank feature maps carry more important information, such that even when a portion of them is not updated, very little damage is done to model performance. Without introducing any additional constraints, HRank leads to significant improvements over the state of the art in FLOPs and parameter reduction at similar accuracy. For example, with ResNet-110 we achieve a 58.2% FLOPs reduction by removing 59.2% of the parameters, with only a small loss in top-1 accuracy on CIFAR-10. With ResNet-50, we achieve a 43.8% FLOPs reduction by removing 36.7% of the parameters, with only a small loss in top-1 accuracy on ImageNet. The code is available at https://github.com/lmbxmu/HRank.
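For a concrete picture of the criterion described above, the following is a minimal PyTorch sketch (not the authors' released implementation): it estimates each filter's importance as the average rank of the feature maps it produces over a batch of inputs, then marks the lowest-ranked filters for pruning. The toy layer, the random inputs, and helper names such as average_filter_ranks and filters_to_prune are illustrative assumptions.

    # Sketch of a rank-based filter-importance score in the spirit of HRank.
    import torch
    import torch.nn as nn

    @torch.no_grad()
    def average_filter_ranks(conv: nn.Conv2d, inputs: torch.Tensor) -> torch.Tensor:
        """Average rank of each filter's feature maps over a batch of inputs.

        Returns a tensor of shape (out_channels,).
        """
        feature_maps = conv(inputs)                              # (N, C_out, H, W)
        # Rank of every HxW feature map, computed per image and per filter.
        ranks = torch.linalg.matrix_rank(feature_maps.float())   # (N, C_out)
        # Average over the batch: per the paper's observation, this estimate is
        # stable with respect to the number of image batches used.
        return ranks.float().mean(dim=0)                         # (C_out,)

    def filters_to_prune(avg_ranks: torch.Tensor, compression_rate: float) -> torch.Tensor:
        """Indices of the filters with the lowest average feature-map rank."""
        num_pruned = int(compression_rate * avg_ranks.numel())
        return torch.argsort(avg_ranks)[:num_pruned]

    # Usage with a toy layer and random tensors standing in for an image batch.
    conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
    images = torch.randn(8, 3, 32, 32)
    avg_ranks = average_filter_ranks(conv, images)
    print("prune filter indices:", filters_to_prune(avg_ranks, compression_rate=0.5).tolist())

In practice the scores would be collected from the feature maps of a trained network on a few real batches, one convolutional layer at a time, before removing the selected filters and fine-tuning.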

Related research:

12/10/2021 · Network Compression via Central Filter
Neural network pruning has remarkable performance for reducing the compl...

12/24/2018 · Dynamic Runtime Feature Map Pruning
High bandwidth requirements are an obstacle for accelerating the trainin...

09/27/2022 · Sauron U-Net: Simple automated redundancy elimination in medical image segmentation via filter pruning
We present Sauron, a filter pruning method that eliminates redundant fea...

05/28/2020 · A Feature-map Discriminant Perspective for Pruning Deep Neural Networks
Network pruning has become the de facto tool to accelerate deep neural n...

11/08/2017 · SIMILARnet: Simultaneous Intelligent Localization and Recognition Network
Global Average Pooling (GAP) [4] has been used previously to generate cl...

05/21/2020 · Feature Statistics Guided Efficient Filter Pruning
Building compact convolutional neural networks (CNNs) with reliable perf...

01/23/2020 · Filter Sketch for Network Pruning
In this paper, we propose a novel network pruning approach by informatio...