Dream Distillation: A Data-Independent Model Compression Framework

05/17/2019
by   Kartikeya Bhardwaj, et al.

Model compression is eminently suited for deploying deep learning on IoT devices. However, existing model compression techniques rely on access to the original dataset or some alternate dataset. In this paper, we address the model compression problem when no real data is available, e.g., when the data is private. To this end, we propose Dream Distillation, a data-independent model compression framework. Our experiments show that Dream Distillation can achieve 88.5% accuracy without using the original data!
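The core idea — distilling a pretrained teacher into a student using only synthetic inputs derived from the teacher itself, never the real training set — can be sketched in miniature. The toy NumPy example below is illustrative only and is not the paper's actual method (which synthesizes images from the teacher's feature statistics): it uses a hypothetical linear teacher, "dreams" inputs by activation maximization on the teacher's class logits, and then trains a student to match the teacher's soft labels on that synthetic data alone.

```python
import numpy as np

rng = np.random.default_rng(0)
D, C = 16, 4  # hypothetical input dimension and number of classes

# "Teacher": a fixed linear classifier standing in for the pretrained model.
W_teacher = rng.normal(size=(C, D))

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def teacher_probs(X):
    return softmax(X @ W_teacher.T)

# Step 1: "dream" synthetic inputs by gradient ascent on the teacher's
# logit for a target class (activation maximization; no real data used).
# For a linear teacher, the gradient of class c's logit w.r.t. x is W_teacher[c].
def dream(target_class, n=32, steps=50, lr=0.1):
    X = rng.normal(scale=0.1, size=(n, D))
    for _ in range(steps):
        X += lr * W_teacher[target_class]
    return X

X_syn = np.vstack([dream(c) for c in range(C)])
Y_soft = teacher_probs(X_syn)  # soft labels produced by the teacher

# Step 2: distill into a fresh student by minimizing cross-entropy
# against the teacher's soft labels on the synthetic data only.
W_student = np.zeros((C, D))
for _ in range(500):
    P = softmax(X_syn @ W_student.T)
    grad = (P - Y_soft).T @ X_syn / len(X_syn)
    W_student -= 0.5 * grad

# Fraction of synthetic inputs on which student and teacher agree.
agreement = np.mean(
    teacher_probs(X_syn).argmax(1) == softmax(X_syn @ W_student.T).argmax(1)
)
```

Even in this toy setting, the student learns to mimic the teacher's decisions on the dreamed inputs without ever seeing real data, which is the property the framework exploits.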
