HEAD: HEtero-Assists Distillation for Heterogeneous Object Detectors

07/12/2022
by Luting Wang, et al.

Conventional knowledge distillation (KD) methods for object detection mainly concentrate on homogeneous teacher-student detectors. However, lightweight detectors designed for deployment often differ significantly in architecture from high-capacity detectors. We therefore investigate KD between heterogeneous teacher-student pairs to broaden its applicability. We observe that the core difficulty of heterogeneous KD (hetero-KD) is the significant semantic gap between the backbone features of heterogeneous detectors, which arises from their different optimization manners. Conventional homogeneous KD (homo-KD) methods suffer from this gap and struggle to achieve satisfactory performance in the hetero-KD setting. In this paper, we propose the HEtero-Assists Distillation (HEAD) framework, which leverages heterogeneous detection heads as assistants to guide the optimization of the student detector and reduce this gap. In HEAD, the assistant is an additional detection head, architecturally homogeneous to the teacher head, attached to the student backbone. A hetero-KD is thereby transformed into a homo-KD, enabling efficient knowledge transfer from teacher to student. Moreover, we extend HEAD into a Teacher-Free HEAD (TF-HEAD) framework for cases where a well-trained teacher detector is unavailable. Our method achieves significant improvement over current detection KD methods. For example, on the MS-COCO dataset, TF-HEAD helps R18 RetinaNet achieve 33.9 mAP (+2.2), while HEAD further pushes this to 36.2 mAP (+4.5).
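The core idea of HEAD (attach an assistant head, architecturally matched to the teacher head, onto the student backbone and distill teacher-head outputs into it) can be illustrated with a heavily simplified NumPy sketch. All names, shapes, and the linear-projection stand-in for a detection head below are hypothetical simplifications, not the paper's actual architecture or loss:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical backbone features: the semantic gap is modeled by the
# student and teacher backbones producing different feature spaces.
student_feats = rng.normal(size=(4, 16))  # student backbone output
teacher_feats = rng.normal(size=(4, 16))  # teacher backbone output

# A detection head is stood in for by a single linear projection.
w_teacher = rng.normal(size=(16, 8))      # frozen teacher head
w_assistant = np.zeros((16, 8))           # assistant head on the student,
                                          # same architecture as the teacher head

def distill_loss(w_a):
    """Homo-KD loss: assistant-head output on student features
    mimics the teacher-head output (simple MSE stand-in)."""
    target = teacher_feats @ w_teacher
    return np.mean((student_feats @ w_a - target) ** 2)

# One manual gradient step on the assistant head for the MSE above.
pred = student_feats @ w_assistant
target = teacher_feats @ w_teacher
grad = 2.0 * student_feats.T @ (pred - target) / pred.size
w_assistant = w_assistant - 0.1 * grad

# The assistant head has moved toward the teacher's output distribution.
print(distill_loss(np.zeros((16, 8))), "->", distill_loss(w_assistant))
```

Because both heads share one architecture, the distillation target and the assistant's prediction live in the same output space, which is the sense in which the hetero-KD problem is reduced to a homo-KD one; in the actual framework the assistant's gradients also shape the student backbone.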

