Discriminability Distillation in Group Representation Learning

08/25/2020
by Manyuan Zhang, et al.

Learning group representations is a common concern in tasks where the basic unit is a group, set, or sequence. Previous work has tried to tackle it by aggregating the elements of a group according to an indicator that is either defined by humans, such as quality or saliency, or produced by a black box, such as an attention score. This article provides a more essential and explicable view. We claim that the most significant indicator of whether a group representation benefits from one of its elements is not quality or an inexplicable score, but the element's discriminability with respect to the model. We explicitly define this discriminability using embedded class centroids on a proxy set. We show that discriminability knowledge has good properties: it can be distilled by a lightweight distillation network, and it generalizes to an unseen target set. We denote the whole procedure discriminability distillation learning (DDL). The proposed DDL can be flexibly plugged into many group-based recognition tasks without altering their original training procedures. Comprehensive experiments on various tasks demonstrate the effectiveness of DDL in both accuracy and efficiency; moreover, it pushes the state of the art on these tasks forward by an impressive margin.
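The abstract does not spell out the scoring formula, so the sketch below is a minimal NumPy illustration of the idea rather than the authors' implementation: it assumes a margin-style discriminability score (similarity to the element's own class centroid minus the highest similarity to any other centroid) and a softmax-weighted aggregation. The function names, the margin definition, and the temperature are illustrative assumptions.

```python
import numpy as np

def discriminability(embedding, centroids, label):
    """Score one element's discriminability w.r.t. embedded class centroids.

    Hypothetical margin-style definition (the paper's exact formulation
    may differ): cosine similarity to the element's own class centroid
    minus the highest similarity to any other centroid. A high score
    means the element is easy to classify and should contribute more
    to the group representation.
    """
    e = embedding / np.linalg.norm(embedding)
    c = centroids / np.linalg.norm(centroids, axis=1, keepdims=True)
    sims = c @ e                          # similarity to every class centroid
    pos = sims[label]                     # own-class similarity
    neg = np.delete(sims, label).max()    # hardest other centroid
    return pos - neg

def aggregate_group(embeddings, scores, temperature=0.5):
    """Fuse element embeddings into one group representation,
    softmax-weighting each element by its discriminability score."""
    w = np.exp(np.asarray(scores) / temperature)
    w = w / w.sum()
    return (w[:, None] * np.asarray(embeddings)).sum(axis=0)

# Toy usage: a 3-element group in a 4-class, 8-dim embedding space.
rng = np.random.default_rng(0)
centroids = rng.normal(size=(4, 8))
group = rng.normal(size=(3, 8))
scores = [discriminability(x, centroids, label=2) for x in group]
fused = aggregate_group(group, scores)
```

In the full method, as the abstract describes, such scores would be computed only on a labeled proxy set; a lightweight distillation network is then trained to predict them directly from raw elements, so discriminability can be estimated on an unseen, unlabeled target set at inference time.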

