Revisiting Knowledge Distillation for Object Detection

05/22/2021
by   Amin Banitalebi-Dehkordi, et al.

The existing solutions for object detection distillation rely on the availability of both a teacher model and ground-truth labels. We propose a new perspective to relax this constraint. In our framework, a student is first trained with pseudo labels generated by the teacher, and then fine-tuned using labeled data, if any are available. Extensive experiments demonstrate improvements over existing object detection distillation algorithms. In addition, decoupling the teacher and ground-truth distillation in this framework provides interesting properties such as: 1) using unlabeled data to further improve the student's performance, 2) combining multiple teacher models of different architectures, even with different object categories, and 3) reducing the need for labeled data (with only 20% of the labels, the student reaches performance comparable to the model trained on the entire set of labels). Furthermore, a by-product of this approach is its potential use for domain adaptation. We verify these properties through extensive experiments.
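The two-stage pipeline in the abstract (train on teacher pseudo labels, then fine-tune on ground truth) can be sketched as below. This is a minimal illustration, not the authors' code: the `teacher`, `student.train`, detection tuple format, and the confidence threshold are all hypothetical placeholders.

```python
def generate_pseudo_labels(teacher, images, score_threshold=0.5):
    """Keep only teacher detections whose confidence clears the threshold."""
    pseudo = []
    for img in images:
        detections = teacher(img)  # each detection: (box, class_id, score)
        pseudo.append([(box, cls) for box, cls, score in detections
                       if score >= score_threshold])
    return pseudo


def distill_then_finetune(student, teacher, unlabeled_images, labeled_data=None):
    # Stage 1: train the student on teacher-generated pseudo labels,
    # which requires no ground-truth annotations.
    pseudo_labels = generate_pseudo_labels(teacher, unlabeled_images)
    student.train(unlabeled_images, pseudo_labels)
    # Stage 2: fine-tune on ground-truth labels, if any are available.
    if labeled_data:
        images, targets = zip(*labeled_data)
        student.train(list(images), list(targets))
    return student
```

Because stage 1 needs only images, the same loop can consume extra unlabeled data or pool pseudo labels from several teachers (after mapping their class ids into one label space), which is how the properties listed above follow from decoupling the two supervision sources.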


Related research

04/01/2022  Unified and Effective Ensemble Knowledge Distillation
Ensemble knowledge distillation can extract knowledge from multiple teac...

04/04/2022  Re-examining Distillation For Continual Object Detection
Training models continually to detect and classify objects, from new cla...

05/16/2018  Object detection at 200 Frames Per Second
In this paper, we propose an efficient and fast object detector which ca...

07/25/2019  Towards Generalizing Sensorimotor Control Across Weather Conditions
The ability of deep learning models to generalize well across different ...

10/13/2022  Weighted Distillation with Unlabeled Examples
Distillation with unlabeled examples is a popular and powerful method fo...

04/07/2021  Distilling and Transferring Knowledge via cGAN-generated Samples for Image Classification and Regression
Knowledge distillation (KD) has been actively studied for image classifi...

10/20/2021  Model Composition: Can Multiple Neural Networks Be Combined into a Single Network Using Only Unlabeled Data?
The diversity of deep learning applications, datasets, and neural networ...
