Knowledge Distillation from Ensemble of Offsets for Head Pose Estimation

08/20/2021
by Andrey Sheka, et al.

This paper proposes a method for estimating head pose from a single image. The estimate is produced by a neural network (NN) trained in two stages. In the first stage, we train a base NN with one regression head and four regression-via-classification (RvC) heads. We then build an ensemble of offsets by applying small offsets to the face bounding box. In the second stage, we perform knowledge distillation (KD) from this offset ensemble of the base NN into a final NN with a single RvC head. On the main test protocol, the offset ensemble improves the results of the base NN, and KD improves on the offset ensemble, yielding an average improvement of 7.7% over the non-ensemble version. The proposed NN improves the state-of-the-art result on AFLW2000 and approaches, with only a minimal gap, the state-of-the-art result on BIWI, even though it is trained on head pose data alone, whereas the previous state-of-the-art model also uses facial landmarks during training. We have made the trained NNs and the face bounding boxes for the 300W-LP, AFLW, AFLW2000, and BIWI datasets publicly available. KD-ResNet152 achieves the best results overall, and even KD-ResNet18 outperforms all previous methods on the AFLW2000 dataset.
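The offset-ensemble idea — perturb the face bounding box by small pixel shifts, run the same NN on each shifted crop, and average the pose predictions — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the `predict` callable, the specific offsets, and the crop logic are all assumptions for the example.

```python
import numpy as np

def offset_boxes(box, offsets):
    # box is (x1, y1, x2, y2); each offset is a (dx, dy) pixel shift
    # applied to the whole box, producing one shifted copy per offset.
    x1, y1, x2, y2 = box
    return [(x1 + dx, y1 + dy, x2 + dx, y2 + dy) for dx, dy in offsets]

def crop(image, box):
    # Crop an H x W x C image to the given box, clipped to image bounds.
    x1, y1, x2, y2 = box
    h, w = image.shape[:2]
    return image[max(y1, 0):min(y2, h), max(x1, 0):min(x2, w)]

def ensemble_pose(predict, image, box, offsets):
    # Run the pose predictor (hypothetical: returns yaw/pitch/roll as an
    # array) on every shifted crop and average the predictions.
    preds = [predict(crop(image, b)) for b in offset_boxes(box, offsets)]
    return np.mean(preds, axis=0)
```

In the paper's second stage, the averaged ensemble prediction would then serve as the soft target for distilling the final single-head RvC network.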



