Soft Weight-Sharing for Neural Network Compression

02/13/2017
by Karen Ullrich, et al.

The success of deep learning in numerous application domains has created the desire to run and train deep networks on mobile devices. This, however, conflicts with their computationally, memory-, and energy-intensive nature, leading to a growing interest in compression. Recent work by Han et al. (2015a) proposes a pipeline that involves retraining, pruning, and quantization of neural network weights, obtaining state-of-the-art compression rates. In this paper, we show that competitive compression rates can be achieved by using a version of soft weight-sharing (Nowlan & Hinton, 1992). Our method achieves both quantization and pruning in one simple (re-)training procedure. This point of view also exposes the relation between compression and the minimum description length (MDL) principle.
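As a concrete illustration, below is a minimal NumPy sketch of the Gaussian-mixture penalty at the heart of soft weight-sharing: the negative log-likelihood of the network weights under a mixture-of-Gaussians prior. The function name, the parameterization (log standard deviations and softmax logits for the mixing proportions), and the scale factor tau mentioned afterward are illustrative assumptions, not the paper's exact implementation.

import numpy as np

def soft_weight_sharing_penalty(weights, means, log_stds, logits):
    """Negative log-likelihood of the weights under a Gaussian mixture prior.

    Minimizing this term pulls each weight toward a mixture mean (soft
    quantization); a dominant component pinned at zero encourages pruning.
    Parameter names and parameterization are illustrative assumptions.
    """
    w = np.asarray(weights, dtype=np.float64).ravel()[:, None]       # (W, 1)
    mu = np.asarray(means, dtype=np.float64)[None, :]                # (1, K)
    sigma = np.exp(np.asarray(log_stds, dtype=np.float64))[None, :]  # (1, K)
    logits = np.asarray(logits, dtype=np.float64)
    pi = np.exp(logits - logits.max())
    pi = pi / pi.sum()                                               # mixing proportions
    # log of pi_k * N(w | mu_k, sigma_k^2), one column per component
    log_comp = (np.log(pi)[None, :] - np.log(sigma)
                - 0.5 * np.log(2.0 * np.pi)
                - 0.5 * ((w - mu) / sigma) ** 2)                     # (W, K)
    # log-sum-exp over components, numerically stable
    m = log_comp.max(axis=1, keepdims=True)
    log_p_w = m[:, 0] + np.log(np.exp(log_comp - m).sum(axis=1))
    return -log_p_w.sum()

During retraining one would minimize task_loss + tau * soft_weight_sharing_penalty(w, ...), learning the mixture parameters jointly with the network weights while keeping one high-proportion component fixed at zero; after convergence, each weight is set to the mean of its most responsible component, yielding a quantized, sparse network in a single procedure.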

Related research

- Soft-to-Hard Vector Quantization for End-to-End Learning Compressible Representations (04/03/2017)
- Improved Bayesian Compression (11/17/2017)
- Scalable Neural Network Compression and Pruning Using Hard Clustering and L1 Regularization (06/14/2018)
- Bayesian Compression for Deep Learning (05/24/2017)
- Entropy-Constrained Training of Deep Neural Networks (12/18/2018)
- Dynamic Probabilistic Pruning: A general framework for hardware-constrained pruning at different granularities (05/26/2021)
- Minimal Random Code Learning: Getting Bits Back from Compressed Model Parameters (09/30/2018)
