Multi-scale Knowledge Distillation for Unsupervised Person Re-Identification

04/21/2022
by   Long Lan, et al.

Unsupervised person re-identification is a challenging and promising task in computer vision. Recent unsupervised person re-identification methods have achieved great improvements by training with pseudo labels. However, appearance noise and label noise are rarely studied explicitly in the unsupervised setting. To relieve the effect of appearance noise on the global features, we also take into account features from two local views and produce multi-scale features. We explore knowledge distillation to filter label noise. Specifically, we first train a teacher model on noisy pseudo labels in an iterative way, and then use the teacher model to guide the learning of our student model. In our setting, the student model converges quickly under the supervision of the teacher model, and thus suffers less interference from the noisy labels that the teacher model was heavily exposed to. After carefully handling the noise in feature learning, our multi-scale knowledge distillation proves very effective for unsupervised re-identification. Extensive experiments on three popular person re-identification datasets demonstrate the superiority of our method. In particular, our approach achieves a state-of-the-art accuracy of 85.7% with ResNet-50 under the fully unsupervised setting.
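The multi-scale idea, roughly: besides the globally pooled feature, two local views are pooled from the same backbone feature map. Below is a minimal PyTorch sketch of that construction; the upper/lower horizontal split, the 256-d embeddings, and all module names are illustrative assumptions, not the paper's exact architecture.

```python
import torch.nn as nn
import torchvision


class MultiScaleEncoder(nn.Module):
    """Global feature plus two horizontal part features from one backbone.

    A sketch of the multi-scale features described in the abstract; the
    part split and embedding size are assumptions, not the authors' design.
    """

    def __init__(self, dim=256):
        super().__init__()
        resnet = torchvision.models.resnet50(weights=None)
        # Drop the final avgpool and fc to keep the spatial feature map.
        self.backbone = nn.Sequential(*list(resnet.children())[:-2])
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.embed_global = nn.Linear(2048, dim)
        self.embed_upper = nn.Linear(2048, dim)
        self.embed_lower = nn.Linear(2048, dim)

    def forward(self, x):
        fmap = self.backbone(x)                          # (B, 2048, H, W)
        h = fmap.size(2)
        g = self.pool(fmap).flatten(1)                   # global view
        u = self.pool(fmap[:, :, : h // 2]).flatten(1)   # upper-body view
        l = self.pool(fmap[:, :, h // 2 :]).flatten(1)   # lower-body view
        return self.embed_global(g), self.embed_upper(u), self.embed_lower(l)
```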
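Pseudo labels in this line of work typically come from clustering the current features, with the model then retrained on the cluster assignments in an iterative loop. A sketch of that clustering step follows, assuming DBSCAN on cosine distance, a common choice in pseudo-label re-ID pipelines that this abstract does not confirm.

```python
import numpy as np
from sklearn.cluster import DBSCAN


def assign_pseudo_labels(features, eps=0.6, min_samples=4):
    """Cluster L2-normalised features; cluster ids become pseudo labels.

    eps and min_samples are illustrative values. DBSCAN marks outliers
    with label -1; those samples are usually discarded for the epoch.
    """
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    labels = DBSCAN(eps=eps, min_samples=min_samples,
                    metric="cosine").fit_predict(feats)
    return labels  # -1 marks outliers to drop
```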
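For the teacher-guided student, a standard Hinton-style distillation loss is one way to realise "the teacher model guides the learning of our student model": the student matches the teacher's softened outputs while still seeing the hard pseudo labels. The temperature and mixing weight below are illustrative assumptions, not values from the paper.

```python
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, pseudo_labels,
                      T=4.0, alpha=0.5):
    """Soft-target distillation mixed with hard pseudo-label supervision.

    A generic KD loss used as a sketch of teacher-guided training;
    T (temperature) and alpha (mixing weight) are assumed values.
    """
    # Soften both distributions with temperature T and match them with KL.
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    log_soft_student = F.log_softmax(student_logits / T, dim=1)
    kd = F.kl_div(log_soft_student, soft_teacher,
                  reduction="batchmean") * T * T
    # Hard loss on the (noisy) clustering pseudo labels.
    ce = F.cross_entropy(student_logits, pseudo_labels)
    return alpha * kd + (1 - alpha) * ce
```

Because the student fits the teacher's smoothed targets rather than the raw pseudo labels alone, it can converge before memorising the label noise the teacher accumulated, which is the fast-convergence effect the abstract points to.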


Related research

11/27/2020 · Enhancing Diversity in Teacher-Student Networks via Asymmetric branches for Unsupervised Person Re-identification
The objective of unsupervised person re-identification (Re-ID) is to lea...

06/08/2023 · Population-Based Evolutionary Gaming for Unsupervised Person Re-identification
Unsupervised person re-identification has achieved great success through...

02/07/2022 · ALM-KD: Knowledge Distillation with noisy labels via adaptive loss mixing
Knowledge distillation is a technique where the outputs of a pretrained ...

09/03/2019 · Knowledge Distillation for End-to-End Person Search
We introduce knowledge distillation for end-to-end person search. End-to...

12/17/2019 · In Defense of the Triplet Loss Again: Learning Robust Person Re-Identification with Fast Approximated Triplet Loss and Label Distillation
The comparative losses (typically, triplet loss) are appealing choices f...

01/15/2020 · Uncertainty-Aware Multi-Shot Knowledge Distillation for Image-Based Object Re-Identification
Object re-identification (re-id) aims to identify a specific object acro...

11/20/2018 · Factorized Distillation: Training Holistic Person Re-identification Model by Distilling an Ensemble of Partial ReID Models
Person re-identification (ReID) is aimed at identifying the same person ...
