Identifying Emotions from Walking using Affective and Deep Features

06/14/2019
by Tanmay Randhavane, et al.

We present a new data-driven model and algorithm to identify the perceived emotions of individuals based on their walking styles. Given an RGB video of an individual walking, we extract his/her walking gait as a sequence of 3D poses. Our goal is to exploit the gait features to classify the emotional state of the human into one of four emotions: happy, sad, angry, or neutral. Our perceived emotion recognition approach uses deep features learned via an LSTM on labeled emotion datasets. Furthermore, we combine these features with affective features computed from the gaits using posture and movement cues, and classify the combined features using a Random Forest classifier. We show that our mapping between the combined feature space and the perceived emotional state provides 80.07% accuracy in identifying the perceived emotions. In addition to classifying discrete categories of emotions, our algorithm also predicts the values of perceived valence and arousal from gaits. We also present an "EWalk (Emotion Walk)" dataset that consists of videos of walking individuals with gaits and labeled emotions. To the best of our knowledge, this is the first gait-based model to identify perceived emotions from videos of walking individuals.
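As a rough illustration of the classification stage described in the abstract, the sketch below concatenates per-gait deep and affective feature vectors and trains a Random Forest on them, using scikit-learn and NumPy. The feature dimensions, the randomly generated stand-in data, and the label set are hypothetical placeholders; the paper's actual deep features come from an LSTM trained on 3D pose sequences, which is not reproduced here.

```python
# Minimal sketch of the combined-feature classification stage.
# Assumptions: feature dimensions and stand-in data are hypothetical,
# not taken from the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

EMOTIONS = ["happy", "sad", "angry", "neutral"]

rng = np.random.default_rng(0)
n_gaits = 200

# Stand-ins for the two feature types the abstract describes:
# deep features learned by an LSTM over 3D pose sequences, and
# affective features computed from posture and movement cues.
deep_features = rng.normal(size=(n_gaits, 32))       # hypothetical dimension
affective_features = rng.normal(size=(n_gaits, 29))  # hypothetical dimension
labels = rng.integers(0, len(EMOTIONS), size=n_gaits)

# Concatenate both feature sets into one vector per gait, then
# classify with a Random Forest, as in the abstract.
X = np.concatenate([deep_features, affective_features], axis=1)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, labels)

predictions = clf.predict(X[:5])
print([EMOTIONS[i] for i in predictions])
```

Predicting continuous valence and arousal values, as the abstract also mentions, would presumably replace the classifier with a regressor (e.g. scikit-learn's RandomForestRegressor) over the same combined feature space.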
