Speech, Head, and Eye-based Cues for Continuous Affect Prediction

07/23/2019
by Jonny O'Dwyer, et al.

Continuous affect prediction involves the regression of time-continuous affect dimensions, most often arousal and valence, at discrete time steps. Researchers in this area are increasingly embracing multimodal model input, which motivates the investigation of previously unexplored affective cues. Speech-based cues have traditionally received the most attention for affect prediction; however, non-verbal inputs have significant potential to increase the performance of affective computing systems and, in addition, allow affect modelling in the absence of speech. Among non-verbal inputs, eye- and head-based cues have received little attention for continuous affect prediction, despite the eyes being involved in emotion display and perception, and head-based cues having been shown to contribute to emotion conveyance and perception. Moreover, both can be estimated non-invasively from video using modern computer vision tools. This work addresses this gap by comprehensively investigating head- and eye-based features, and their combination with speech, for continuous affect prediction. Hand-crafted, automatically generated, and CNN-learned features from these modalities will be investigated. The highest-performing feature sets and feature-set combinations will show how effective these features are for predicting an individual's affective state.


