Vision based body gesture meta features for Affective Computing

02/10/2020
by Indigo J. D. Orton, et al.

Early detection of psychological distress is key to effective treatment. Automatic detection of distress, such as depression, is an active area of research. Current approaches utilise vocal, facial, and bodily modalities. Of these, the bodily modality is the least investigated, partially due to the difficulty of extracting bodily representations from videos, and partially due to the lack of viable datasets. Existing body-modality approaches use automatic categorisation of expressions to represent body language as a series of specific expressions, much like words within natural language. In this dissertation I present a new type of feature, within the body modality, that represents meta information of gestures, such as speed, and use it to predict a non-clinical depression label. This differs from existing work by representing overall behaviour as a small set of aggregated meta features derived from a person's movement. In my method I extract pose estimates from videos, detect gestures within body parts, extract meta information from individual gestures, and finally aggregate these features to generate a small feature vector for use in prediction tasks. I introduce a new dataset of 65 video recordings of interviews with self-evaluated distress, personality, and demographic labels. This dataset enables the development of features utilising the whole body in distress detection tasks. I evaluate my newly introduced meta features for predicting depression, anxiety, perceived stress, somatic stress, five standard personality measures, and gender. A linear regression based classifier using these features achieves an 82.70% F1 score for predicting depression on the novel dataset.
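The pipeline described above (detect gestures within a body part's trajectory, extract per-gesture meta information such as speed and duration, then aggregate into a small fixed-length vector) can be sketched as follows. This is an illustrative toy implementation, not the dissertation's actual method: the speed threshold, the 1-D trajectory input, and the particular aggregate statistics are all assumptions made for the example.

```python
import numpy as np

def gesture_segments(positions, speed_thresh=0.5):
    """Split a 1-D body-part trajectory into gesture segments.

    A 'gesture' here is a maximal run of frames whose frame-to-frame
    speed exceeds `speed_thresh` (a simplifying heuristic, not the
    paper's exact gesture detector).
    """
    speeds = np.abs(np.diff(positions))
    moving = speeds > speed_thresh
    segments, start = [], None
    for i, m in enumerate(moving):
        if m and start is None:
            start = i                       # gesture begins
        elif not m and start is not None:
            segments.append((start, i))     # gesture ends
            start = None
    if start is not None:
        segments.append((start, len(moving)))
    return segments, speeds

def meta_features(positions, speed_thresh=0.5):
    """Aggregate per-gesture meta information (count, duration, speed)
    into a small fixed-length feature vector for a downstream classifier."""
    segments, speeds = gesture_segments(positions, speed_thresh)
    if not segments:
        return np.zeros(4)
    durations = [end - start for start, end in segments]
    mean_speeds = [speeds[start:end].mean() for start, end in segments]
    return np.array([
        len(segments),        # gesture count
        np.mean(durations),   # average gesture duration (frames)
        np.mean(mean_speeds), # average gesture speed
        np.max(mean_speeds),  # peak gesture speed
    ])

# Example: a trajectory with two bursts of movement
traj = np.array([0, 0, 1, 2, 3, 3, 3, 5, 7, 7], dtype=float)
print(meta_features(traj))  # → [2.  2.5 1.5 2. ]
```

In practice the trajectories would come from multi-joint pose estimates rather than a single 1-D signal, and the resulting vector would feed the linear regression based classifier mentioned in the abstract.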

Related research:

- Looking At The Body: Automatic Analysis of Body Gestures and Self-Adaptors in Psychological Distress (07/31/2020)
  Psychological distress is a significant and growing issue in society. Au...

- Survey on Emotional Body Gesture Recognition (01/23/2018)
  Automatic emotion recognition has become a trending research topic in th...

- MPE4G: Multimodal Pretrained Encoder for Co-Speech Gesture Generation (05/25/2023)
  When virtual agents interact with humans, gestures are crucial to delive...

- Text2Gestures: A Transformer-Based Network for Generating Emotive Body Gestures for Virtual Agents (01/26/2021)
  We present Text2Gestures, a transformer-based learning method to interac...

- Towards early prediction of neurodevelopmental disorders: Computational model for Face Touch and Self-adaptors in Infants (01/07/2023)
  Infants' neurological development is heavily influenced by their motor s...

- The ReprGesture entry to the GENEA Challenge 2022 (08/25/2022)
  This paper describes the ReprGesture entry to the Generation and Evaluat...

- Automated Analysis and Prediction of Job Interview Performance (04/14/2015)
  We present a computational framework for automatically quantifying verba...