Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression

03/19/2020
by   Yawei Li, et al.
ETH Zurich

In this paper, we analyze two popular network compression techniques, filter pruning and low-rank decomposition, within a unified framework. Simply changing the way the sparsity regularization is enforced derives either filter pruning or low-rank decomposition. This provides a flexible choice for network compression because the two techniques complement each other. For example, in popular network architectures with shortcut connections (e.g. ResNet), filter pruning cannot deal with the last convolutional layer in a ResBlock, while low-rank decomposition methods can. In addition, we propose to compress the whole network jointly instead of in a layer-wise manner. Our approach proves its potential as it compares favorably to the state-of-the-art on several benchmarks.
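The hinge between the two techniques can be illustrated with group sparsity on different dimensions of a weight matrix. The sketch below is an illustrative toy example, not the paper's implementation: `group_soft_threshold` is a hypothetical helper applying the standard group-lasso proximal step, and the regularization strengths `lam` are arbitrary. Grouping over the filter (row) dimension zeroes whole filters (pruning); grouping over the rank dimension of a factorization removes rank components (decomposition).

```python
import numpy as np

def group_soft_threshold(M, lam, axis):
    """Group-lasso proximal step: shrink each group of M by its l2 norm;
    groups whose norm falls below lam are zeroed out entirely.
    axis=1 treats each row as a group; axis=0 treats each column."""
    norms = np.linalg.norm(M, axis=axis, keepdims=True)
    scale = np.maximum(1.0 - lam / np.maximum(norms, 1e-12), 0.0)
    return M * scale

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 16))  # toy conv weight: 8 output filters x 16 inputs

# Filter pruning: each row (filter) of W is one group; rows driven to
# zero correspond to pruned filters.
W_pruned = group_soft_threshold(W, lam=4.0, axis=1)
kept_filters = np.flatnonzero(np.linalg.norm(W_pruned, axis=1) > 0)

# Decomposition: factor W = U @ V, then group over the rank dimension of V;
# zeroed rows of V (with the matching columns of U) shrink the rank.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
V = np.diag(s) @ Vt
V_sparse = group_soft_threshold(V, lam=2.0, axis=1)
kept_rank = np.flatnonzero(np.linalg.norm(V_sparse, axis=1) > 0)
W_lowrank = U[:, kept_rank] @ V_sparse[kept_rank]
```

In both cases the same regularizer is used; only the grouping changes, which is the sense in which group sparsity "hinges" pruning and decomposition.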



11/02/2021

Low-Rank+Sparse Tensor Compression for Neural Networks

Low-rank tensor compression has been proposed as a promising approach to...
04/30/2020

TRP: Trained Rank Pruning for Efficient Deep Neural Networks

To enable DNNs on edge devices like mobile phones, low-rank approximatio...
10/09/2019

Trained Rank Pruning for Efficient Deep Neural Networks

To accelerate DNNs inference, low-rank approximation has been widely ado...
03/12/2019

Cascaded Projection: End-to-End Network Compression and Acceleration

We propose a data-driven approach for deep convolutional neural network ...
03/25/2022

Vision Transformer Compression with Structured Pruning and Low Rank Approximation

Transformer architecture has gained popularity due to its ability to sca...
08/13/2022

Entropy Induced Pruning Framework for Convolutional Neural Networks

Structured pruning techniques have achieved great compression performanc...
05/24/2021

Towards Compact CNNs via Collaborative Compression

Channel pruning and tensor decomposition have received extensive attenti...

Code Repositories

group_sparsity

Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression. CVPR2020.

