One-shot Network Pruning at Initialization with Discriminative Image Patches

by Yinan Yang, et al.

One-shot Network Pruning at Initialization (OPaI) is an effective way to reduce the cost of network pruning. Recently, a belief has grown that data is unnecessary in OPaI. However, we reach the opposite conclusion through ablation experiments on two representative OPaI methods, SNIP and GraSP. Specifically, we find that informative data is crucial to improving pruning performance. In this paper, we propose two novel methods, Discriminative One-shot Network Pruning (DOP) and Super Stitching, which prune the network using high-level, visually discriminative image patches. Our contributions are as follows. (1) Extensive experiments reveal that OPaI is data-dependent. (2) Super Stitching performs significantly better than the original OPaI methods on the ImageNet benchmark, especially for highly compressed models.
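For context on the OPaI methods the abstract names: SNIP scores each connection at initialization by the magnitude of the loss gradient times the weight, |∂L/∂w ⊙ w|, and keeps the top-scoring connections in a single shot. Below is a minimal NumPy sketch of that criterion on a toy one-layer model; the model, sizes, and variable names are illustrative, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear model y = Wx with a squared-error loss on one mini-batch.
W = rng.normal(size=(4, 8))     # weights at initialization
x = rng.normal(size=(8, 16))    # one mini-batch of inputs
y = rng.normal(size=(4, 16))    # targets

# Gradient of 0.5 * ||Wx - y||^2 with respect to W.
grad = (W @ x - y) @ x.T

# SNIP-style connection saliency: |gradient * weight|.
saliency = np.abs(grad * W)

# Keep the top 25% of connections; prune the rest in one shot.
keep = int(0.25 * W.size)
threshold = np.sort(saliency.ravel())[-keep]
mask = (saliency >= threshold).astype(W.dtype)

W_pruned = W * mask
print(int(mask.sum()), W.size)  # kept connections vs. total
```

The paper's point is that the saliency above depends on the mini-batch `x`, `y` used at initialization, which is why replacing random batches with discriminative image patches can change which connections survive.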




Related Research

- Sparse Training via Boosting Pruning Plasticity with Neuroregeneration: Works on lottery ticket hypothesis (LTH) and single-shot network pruning...
- Picking Winning Tickets Before Training by Preserving Gradient Flow: Overparameterization has been shown to benefit both the optimization and...
- Why is Pruning at Initialization Immune to Reinitializing and Shuffling?: Recent studies assessing the efficacy of pruning neural networks methods...
- Prospect Pruning: Finding Trainable Weights at Initialization using Meta-Gradients: Pruning neural networks at initialization would enable us to find sparse...
- Advancing Model Pruning via Bi-level Optimization: The deployment constraints in practical applications necessitate the pru...
- Progressive Skeletonization: Trimming more fat from a network at initialization: Recent studies have shown that skeletonization (pruning parameters) of n...
- Unsupervised Discovery of Mid-Level Discriminative Patches: The goal of this paper is to discover a set of discriminative patches wh...
