Action Unit Detection with Joint Adaptive Attention and Graph Relation

07/09/2021
by   Chenggong Zhang, et al.

This paper describes our approach to facial action unit (AU) detection, submitted to the Affective Behavior Analysis in-the-wild (ABAW) 2021 competition. The proposed method uses a pre-trained JAA model as the feature extractor and, on top of its multi-scale features, extracts global features, face alignment features, and AU local features. The AU local features are fed into a graph convolution network to model the correlations between AUs, and the fused features are then used to classify each AU. Detection performance was evaluated as 0.5 * accuracy + 0.5 * F1 score. Our model achieves 0.674 on the challenging Aff-Wild2 database.
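The competition score combines two standard classification metrics. A minimal sketch of the per-AU score, computed from hypothetical confusion-matrix counts (the function name and inputs are illustrative, not from the paper's code):

```python
def au_score(tp, fp, tn, fn):
    """Competition metric for one AU: 0.5 * accuracy + 0.5 * F1.

    tp/fp/tn/fn are hypothetical confusion-matrix counts for a
    single binary AU label over all frames.
    """
    total = tp + fp + tn + fn
    accuracy = (tp + tn) / total
    # Guard against division by zero when an AU is never predicted
    # or never present in the ground truth.
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return 0.5 * accuracy + 0.5 * f1
```

Because the score averages accuracy and F1, a model cannot reach a high value on Aff-Wild2's imbalanced AU labels by simply predicting the majority class: that inflates accuracy but drives F1 toward zero.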


research
03/15/2018

Deep Adaptive Attention for Joint Facial Action Unit Detection and Face Alignment

Facial action unit (AU) detection and face alignment are two highly corr...
research
02/04/2020

Multi-label Relation Modeling in Facial Action Units Detection

This paper describes an approach to the facial action units detections. ...
research
03/19/2023

Multi-modal Facial Action Unit Detection with Large Pre-trained Models for the 5th Competition on Affective Behavior Analysis in-the-wild

Facial action unit detection has emerged as an important task within fac...
research
11/11/2022

FAN-Trans: Online Knowledge Distillation for Facial Action Unit Detection

Due to its importance in facial behaviour analysis, facial action unit (...
research
03/16/2023

EmotiEffNet Facial Features in Uni-task Emotion Recognition in Video at ABAW-5 competition

In this article, the results of our team for the fifth Affective Behavio...
research
09/23/2020

LoRRaL: Facial Action Unit Detection Based on Local Region Relation Learning

End-to-end convolution representation learning has been proved to be ver...
