Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-tailed Classification

01/06/2020
by Liuyu Xiang, et al.

In real-world scenarios, data tends to exhibit a long-tailed, imbalanced distribution. Developing algorithms that can handle such long-tailed distributions is therefore indispensable in practical applications. In this paper, we propose a novel self-paced knowledge distillation framework, termed Learning From Multiple Experts (LFME). Our method is inspired by the observation that deep convolutional neural networks (CNNs) trained on less imbalanced subsets of the full long-tailed distribution often yield better performance than their jointly trained counterparts. We refer to these models as 'Expert Models', and the proposed LFME framework aggregates the knowledge from multiple 'Experts' to learn a unified student model. Specifically, the framework involves two levels of self-paced learning schedules, Self-paced Expert Selection and Self-paced Instance Selection, so that knowledge is adaptively transferred from the multiple 'Experts' to the 'Student'. To verify the effectiveness of the proposed framework, we conduct extensive experiments on two long-tailed benchmark classification datasets. The results demonstrate that our method achieves superior performance compared to state-of-the-art methods. We also show that our method can be easily plugged into state-of-the-art long-tailed classification algorithms for further improvements.
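The core distillation step can be pictured with a short sketch: the student is supervised both by the ground-truth labels and by the softened outputs of several expert models, with each expert's contribution scaled by a schedule weight. The PyTorch-style code below is a minimal illustration under these assumptions, not the paper's exact formulation; the function name multi_expert_distillation_loss, the expert_weights argument, and the temperature/alpha hyper-parameters are introduced here purely for illustration.

    import torch
    import torch.nn.functional as F

    def multi_expert_distillation_loss(student_logits, expert_logits_list,
                                       expert_weights, targets,
                                       temperature=2.0, alpha=0.5):
        """Combine cross-entropy on ground-truth labels with temperature-softened
        KL-divergence distillation terms from several expert models.

        expert_logits_list: one (batch, num_classes) tensor per expert.
        expert_weights:     one scalar per expert; in a self-paced schedule these
                            would be adapted as the student catches up with each
                            expert (hypothetical weighting, not the paper's formula).
        """
        # Standard supervised term on the hard labels.
        ce_loss = F.cross_entropy(student_logits, targets)

        # Soft-target distillation term, summed over experts.
        kd_loss = 0.0
        log_p_student = F.log_softmax(student_logits / temperature, dim=1)
        for w, expert_logits in zip(expert_weights, expert_logits_list):
            p_expert = F.softmax(expert_logits.detach() / temperature, dim=1)
            kd_loss = kd_loss + w * F.kl_div(log_p_student, p_expert,
                                             reduction="batchmean") * temperature ** 2

        return alpha * ce_loss + (1.0 - alpha) * kd_loss

    # Example usage with three experts trained on, e.g., many-, medium- and
    # few-shot subsets (the subset split and the weights are illustrative):
    # loss = multi_expert_distillation_loss(
    #     student_out, [e1_out, e2_out, e3_out],
    #     expert_weights=[0.2, 0.5, 1.0], targets=labels)

Down-weighting an expert once the student matches it, and gradually admitting harder instances, is the intuition behind the two self-paced schedules described in the abstract; the scalar weights above are simply where such a schedule would plug in.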
