Audio-Visual Sentiment Analysis for Learning Emotional Arcs in Movies

by Eric Chu et al.

Stories can have tremendous power: beyond entertainment, they can activate our interests and mobilize our actions. The degree to which a story resonates with its audience may be reflected in part in the emotional journey on which it takes the audience. In this paper, we use machine learning methods to construct emotional arcs in movies, compute families of arcs, and demonstrate the ability of certain arcs to predict audience engagement. The system is applied to Hollywood films and to high-quality shorts found on the web. We begin by using deep convolutional neural networks for audio and visual sentiment analysis. These models are trained on both new and existing large-scale datasets, after which they are used to compute separate audio and visual emotional arcs. We then crowdsource annotations for 30-second video clips extracted from highs and lows in the arcs in order to assess the micro-level precision of the system, with precision measured as agreement in polarity between the system's predictions and the annotators' ratings. These annotations are also used to combine the audio and visual predictions. Next, we examine macro-level characterizations of movies by investigating whether there exist 'universal shapes' of emotional arcs; in particular, we develop a clustering approach to discover distinct classes of emotional arcs. Finally, we show on a sample corpus of short web videos that certain emotional arcs are statistically significant predictors of the number of comments a video receives. These results suggest that the emotional arcs learned by our approach capture macroscopic aspects of a video story that drive audience engagement. Such machine understanding could be used to predict audience reactions to video stories, ultimately improving our ability as storytellers to communicate with one another.
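The arc-construction and clustering steps described in the abstract could be sketched roughly as follows. This is a minimal illustration, not the paper's method: the per-clip valence scores are assumed to come from the sentiment CNNs the paper trains, the smoothing window and arc length are arbitrary, and a toy k-means with farthest-point initialisation stands in for whatever clustering procedure the authors actually use. All function names here are hypothetical.

```python
import numpy as np

def emotional_arc(scores, window=5, n_points=20):
    """Smooth per-clip sentiment scores into a fixed-length arc.

    `scores` is a 1-D sequence of valence predictions over time
    (hypothetically, the output of the audio or visual sentiment CNN).
    """
    scores = np.asarray(scores, dtype=float)
    kernel = np.ones(window) / window
    smooth = np.convolve(scores, kernel, mode="same")  # moving average
    # Resample to a common length so arcs from movies of different
    # durations can be compared and clustered together.
    x_old = np.linspace(0.0, 1.0, len(smooth))
    x_new = np.linspace(0.0, 1.0, n_points)
    return np.interp(x_new, x_old, smooth)

def cluster_arcs(arcs, k=2, iters=50):
    """Toy k-means over fixed-length arcs (a stand-in for discovering
    'families' of emotional arcs)."""
    arcs = np.asarray(arcs, dtype=float)
    # Farthest-point initialisation keeps this small example deterministic.
    centers = [arcs[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(arcs - c, axis=1) for c in centers], axis=0)
        centers.append(arcs[d.argmax()])
    centers = np.stack(centers)
    for _ in range(iters):
        # Assign each arc to its nearest center, then recompute centers.
        d = np.linalg.norm(arcs[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = arcs[labels == j].mean(axis=0)
    return labels, centers
```

For example, feeding in a few steadily rising and a few steadily falling valence curves and clustering with k=2 separates the two shapes into distinct arc families.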


