Revisiting Random Channel Pruning for Neural Network Compression

05/11/2022
by Yawei Li, et al.
ETH Zurich

Channel (or 3D filter) pruning is an effective way to accelerate the inference of neural networks. A flurry of algorithms has been proposed for this practical problem, each claimed to be effective in some respect. Yet a benchmark for comparing these algorithms directly is lacking, mainly because of the complexity of the algorithms and custom settings such as particular network configurations or training procedures. A fair benchmark is important for the further development of channel pruning. Meanwhile, recent investigations reveal that the channel configurations discovered by pruning algorithms are at least as important as the pre-trained weights. This gives channel pruning a new role, namely searching for the optimal channel configuration. In this paper, we determine the channel configuration of the pruned models by random search. The proposed approach provides a new way to compare different methods, namely how well they behave relative to random pruning. We show that this simple strategy works quite well compared with other channel pruning methods. We also show that, under this setting, there are surprisingly no clear winners among different channel importance evaluation methods, which may tilt research efforts toward advanced channel configuration search methods.
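To illustrate the random-search idea described above, here is a minimal, self-contained sketch in plain Python. It samples random per-layer channel-keep ratios and retains the configuration that best fills a parameter budget. The layer sizes, the 3-channel input assumption, and the simplified parameter-count model are hypothetical illustrations, not the paper's actual implementation.

```python
import random

def sample_config(base_channels, budget_ratio, trials=1000, seed=0):
    """Randomly search for a channel configuration whose (simplified)
    parameter count fits a target budget.

    base_channels: per-layer output channels of the unpruned network.
    budget_ratio:  fraction of the original parameter count to keep.
    Returns the sampled configuration closest to (but not above) the budget.
    """
    rng = random.Random(seed)

    def param_count(channels):
        # Rough cost model: in_channels * out_channels per layer
        # (kernel size omitted; it cancels when comparing ratios).
        total, prev = 0, 3  # assume a 3-channel (RGB) input
        for c in channels:
            total += prev * c
            prev = c
        return total

    target = budget_ratio * param_count(base_channels)
    best, best_gap = None, float("inf")
    for _ in range(trials):
        # Keep a random fraction of channels in each layer independently.
        cand = [max(1, int(c * rng.uniform(0.2, 1.0))) for c in base_channels]
        cost = param_count(cand)
        if cost <= target and target - cost < best_gap:
            best, best_gap = cand, target - cost
    return best

config = sample_config([64, 128, 256, 512], budget_ratio=0.5)
print(config)
```

Each sampled configuration would then be trained (or fine-tuned) and evaluated; the random search over configurations, rather than any particular channel importance criterion, is what the paper proposes to benchmark against.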

