Activation Map Adaptation for Effective Knowledge Distillation

10/26/2020
by   Zhiyuan Wu, et al.

Model compression has become a recent trend due to the need to deploy neural networks on embedded and mobile devices, where both accuracy and efficiency are of critical importance. To strike a balance between them, a knowledge distillation strategy is proposed for general visual representation learning. It uses a well-designed activation map adaptive module to replace some blocks of the teacher network, adaptively exploring the most appropriate supervisory features during training. The teacher's hidden-layer outputs guide the student network's training, transferring effective semantic information. To verify the effectiveness of the strategy, the method is applied to the CIFAR-10 dataset. Results demonstrate that it can boost the accuracy of the student network by 0.6
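The abstract does not specify the exact form of the adaptive module or the loss, so the following is only a minimal NumPy sketch of the general idea of feature-based knowledge distillation: a softened KL term aligns the student's logits with the teacher's, and an MSE term matches the student's hidden features to teacher features (assumed here to have already passed through the adapter). The function name, temperature `T`, and weights `alpha`/`beta` are illustrative assumptions, not the paper's values.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax along the last axis (numerically stable).
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits,
                      student_feat, adapted_teacher_feat,
                      T=4.0, alpha=0.5, beta=0.1):
    """Hypothetical combined distillation loss:
    - softened KL divergence between teacher and student predictions
      (scaled by T^2, as is conventional in distillation), plus
    - MSE between student features and adapted teacher features.
    The task cross-entropy term is omitted for brevity."""
    p_t = softmax(teacher_logits, T)
    log_p_s = np.log(softmax(student_logits, T) + 1e-12)
    kl = np.mean(np.sum(p_t * (np.log(p_t + 1e-12) - log_p_s), axis=-1)) * T * T
    mse = np.mean((student_feat - adapted_teacher_feat) ** 2)
    return alpha * kl + beta * mse
```

When the student matches the teacher exactly, both terms vanish and the loss is zero; mismatched logits or features increase it, which is what drives the student toward the teacher's hidden representations.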


Related research

07/23/2019 · Highlight Every Step: Knowledge Distillation via Collaborative Teaching
High storage and computational costs obstruct deep neural networks to be...

04/19/2023 · Knowledge Distillation Under Ideal Joint Classifier Assumption
Knowledge distillation is a powerful technique to compress large neural ...

11/08/2018 · Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons
An activation boundary for a neuron refers to a separating hyperplane th...

03/25/2021 · Spirit Distillation: Precise Real-time Prediction with Insufficient Data
Recent trend demonstrates the effectiveness of deep neural networks (DNN...

09/17/2021 · Distilling Linguistic Context for Language Model Compression
A computationally expensive and memory intensive neural network lies beh...

09/15/2022 · Layerwise Bregman Representation Learning with Applications to Knowledge Distillation
In this work, we propose a novel approach for layerwise representation l...
