Data-Dependent Coresets for Compressing Neural Networks with Applications to Generalization Bounds

04/15/2018
by Cenk Baykal, et al.

The deployment of state-of-the-art neural networks containing millions of parameters to resource-constrained platforms may be prohibitive in terms of both time and space. In this work, we present an efficient coreset-based neural network compression algorithm that provably sparsifies the parameters of a trained feedforward neural network in a manner that approximately preserves the network's output. Our approach is based on an importance sampling scheme that judiciously defines a sampling distribution over the neural network parameters and, as a result, retains parameters of high importance while discarding redundant ones. Our method and analysis introduce an empirical notion of sensitivity and extend traditional coreset constructions to the task of compressing parameters. Our theoretical analysis establishes both instance-dependent and instance-independent bounds on the size of the resulting compressed network as a function of user-specified tolerance and failure probability parameters. As a corollary to our practical compression algorithm, we obtain generalization bounds that may provide new insights into the generalization properties of neural networks.
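To make the sampling scheme concrete, the following is a minimal NumPy sketch of sensitivity-based importance sampling for the incoming weights of a single neuron; it is not the authors' reference implementation. The names empirical_sensitivity and sparsify_neuron are hypothetical, and the sketch assumes nonnegative weights and activations for simplicity; the paper's full construction handles general signs and prescribes the sample size m needed for its guarantees.

```python
import numpy as np

def empirical_sensitivity(weights, activations):
    """Empirical sensitivity of each incoming edge: the maximum, over sampled
    inputs x, of the edge's relative contribution w_j * a_j(x) / sum_k w_k * a_k(x)
    to the neuron's pre-activation."""
    contrib = weights[None, :] * activations      # shape: (n_samples, n_edges)
    totals = contrib.sum(axis=1, keepdims=True)   # shape: (n_samples, 1)
    return (contrib / totals).max(axis=0)         # shape: (n_edges,)

def sparsify_neuron(weights, activations, m, rng=None):
    """Sample m edges (with replacement) with probability proportional to their
    sensitivity, then reweight surviving edges by count / (m * p_j) so that the
    sparse neuron's pre-activation is unbiased in expectation."""
    rng = rng or np.random.default_rng(0)
    s = empirical_sensitivity(weights, activations)
    p = s / s.sum()                               # sampling distribution
    counts = rng.multinomial(m, p)                # how often each edge is drawn
    return np.where(counts > 0, weights * counts / (m * p), 0.0)

# Toy usage: a neuron with 1000 inputs, compressed to at most 50 nonzero weights.
w = np.abs(np.random.default_rng(1).normal(size=1000))
a = np.abs(np.random.default_rng(2).normal(size=(64, 1000)))  # sampled activations
w_hat = sparsify_neuron(w, a, m=50)
print(np.count_nonzero(w_hat), "nonzero of", w.size)
```

Edges with high sensitivity are sampled with high probability and therefore retained, while edges that never contribute much to the output are likely to receive zero samples and be discarded, matching the importance-sampling intuition above.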


Related research

10/11/2019
SiPPing Neural Networks: Sensitivity-informed Provable Pruning of Neural Networks
We introduce a pruning algorithm that provably sparsifies the parameters...

11/24/2022
PAC-Bayes Compression Bounds So Tight That They Can Explain Generalization
While there has been progress in developing non-vacuous generalization b...

01/14/2020
Understanding Generalization in Deep Learning via Tensor Methods
Deep neural networks generalize well on unseen data though the number of...

12/08/2021
Generalization Error Bounds for Iterative Recovery Algorithms Unfolded as Neural Networks
Motivated by the learned iterative soft thresholding algorithm (LISTA), ...

07/09/2019
On Activation Function Coresets for Network Pruning
Model compression provides a means to efficiently deploy deep neural net...

04/20/2022
Investigating the Optimal Neural Network Parameters for Decoding
Neural Networks have been proved to work as decoders in telecommunicatio...
