Rethinking the Knowledge Distillation From the Perspective of Model Calibration

10/31/2021
by Lehan Yang, et al.

Recent years have witnessed dramatic improvements in knowledge distillation, which can produce a compact student model that is more efficient while retaining much of the teacher model's effectiveness. Previous studies have found that a more accurate teacher does not necessarily make a better teacher, due to a mismatch in capacity between teacher and student. In this paper, we analyze this phenomenon from the perspective of model calibration. We find that a larger teacher model may be over-confident, so the student model cannot imitate it effectively. However, after a simple calibration of the teacher model, the size of the teacher model correlates positively with the performance of the student model.
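The abstract does not specify which calibration method is applied to the teacher; a common "simple" choice is temperature scaling, where a single scalar temperature is fitted on a held-out validation set to minimize the teacher's negative log-likelihood before its softened outputs are used for distillation. The sketch below is an illustrative PyTorch example of that idea, not the paper's actual implementation; fit_temperature, distillation_loss, and all hyperparameters are assumptions.

```python
# Illustrative sketch (assumption): temperature scaling as the "simple
# model calibration" of the teacher, followed by a standard KD loss.
import torch
import torch.nn.functional as F

def fit_temperature(teacher_logits, labels, lr=0.01, steps=200):
    """Fit one scalar temperature on held-out validation logits by
    minimizing the teacher's negative log-likelihood (temperature scaling)."""
    log_t = torch.zeros(1, requires_grad=True)      # optimize log(T) so T stays positive
    optimizer = torch.optim.Adam([log_t], lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        loss = F.cross_entropy(teacher_logits / log_t.exp(), labels)
        loss.backward()
        optimizer.step()
    return log_t.exp().item()

def distillation_loss(student_logits, teacher_logits, labels, T, alpha=0.5):
    """Standard KD objective: KL between temperature-softened teacher and
    student distributions, mixed with cross-entropy on the hard labels."""
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_soft_student = F.log_softmax(student_logits / T, dim=-1)
    kd = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Usage sketch: calibrate the teacher on a validation split, then distill.
if __name__ == "__main__":
    val_logits = torch.randn(256, 10)           # stand-in for teacher logits on validation data
    val_labels = torch.randint(0, 10, (256,))   # stand-in for validation labels
    T = fit_temperature(val_logits, val_labels)

    student_logits = torch.randn(32, 10, requires_grad=True)
    teacher_logits = torch.randn(32, 10)
    labels = torch.randint(0, 10, (32,))
    loss = distillation_loss(student_logits, teacher_logits, labels, T)
    loss.backward()
```

In this sketch the fitted calibration temperature also serves as the softening temperature in the KD loss; in practice the two temperatures can be chosen independently.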

Related research:

06/10/2021  Does Knowledge Distillation Really Work?
Knowledge distillation is a popular technique for training a small stude...

10/22/2022  Hard Gate Knowledge Distillation – Leverage Calibration for Robust and Reliable Language Model
In knowledge distillation, a student model is trained with supervisions ...

06/07/2023  Faithful Knowledge Distillation
Knowledge distillation (KD) has received much attention due to its succe...

04/15/2023  Teacher Network Calibration Improves Cross-Quality Knowledge Distillation
We investigate cross-quality knowledge distillation (CQKD), a knowledge ...

12/25/2022  BD-KD: Balancing the Divergences for Online Knowledge Distillation
Knowledge distillation (KD) has gained a lot of attention in the field o...

05/03/2023  SCOTT: Self-Consistent Chain-of-Thought Distillation
Large language models (LMs) beyond a certain scale, demonstrate the emer...

03/14/2023  Teacher-Student Knowledge Distillation for Radar Perception on Embedded Accelerators
Many radar signal processing methodologies are being developed for criti...
