Dream Distillation: A Data-Independent Model Compression Framework

05/17/2019
by Kartikeya Bhardwaj, et al.

Model compression is eminently suited for deploying deep learning on IoT devices. However, existing model compression techniques rely on access to the original or some alternate dataset. In this paper, we address the model compression problem when no real data is available, e.g., when the data is private. To this end, we propose Dream Distillation, a data-independent model compression framework. Our experiments show that Dream Distillation can achieve 88.5% accuracy on the CIFAR-10 test set without actually training on the original data!
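
The underlying recipe is knowledge distillation driven entirely by synthetic inputs: since the real dataset is unavailable, the student is trained on images generated from the teacher network itself. The sketch below is a minimal, hypothetical PyTorch illustration of that general idea; the class-target synthesis objective, model interfaces, and hyperparameters are illustrative assumptions, not the authors' exact pipeline (the paper derives its synthesis targets from the teacher's activation statistics rather than from raw class labels).

# A minimal, hypothetical sketch of data-free knowledge distillation in PyTorch.
# This is NOT the authors' exact Dream Distillation method; all function names,
# objectives, and hyperparameters below are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

def synthesize_dream_batch(teacher, batch_size=64, num_classes=10,
                           image_shape=(3, 32, 32), steps=200, lr=0.05):
    """Optimize random noise so the teacher confidently predicts random target
    classes -- a DeepDream-style stand-in for real training images."""
    teacher.eval()
    images = torch.randn(batch_size, *image_shape, requires_grad=True)
    targets = torch.randint(0, num_classes, (batch_size,))
    opt = torch.optim.Adam([images], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        logits = teacher(images)
        # Cross-entropy pushes the synthetic images toward the target classes;
        # the small L2 term keeps pixel magnitudes in a reasonable range.
        loss = F.cross_entropy(logits, targets) + 1e-4 * images.pow(2).mean()
        loss.backward()
        opt.step()
    return images.detach()

def distill_step(student, teacher, images, optimizer, temperature=4.0):
    """One knowledge-distillation step: match the student's softened logits to
    the teacher's on the synthetic batch (Hinton et al., 2015)."""
    teacher.eval()
    student.train()
    with torch.no_grad():
        teacher_logits = teacher(images)
    student_logits = student(images)
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage sketch: alternate synthesizing a batch from the teacher with a
# distillation step on the student.
#   images = synthesize_dream_batch(teacher)
#   loss = distill_step(student, teacher, images, optimizer)

Only the teacher's weights are needed to produce a training signal here; no real images ever enter the loop, which is what makes this style of compression applicable when the dataset is private.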
