An analysis of observation length requirements in spoken language for machine understanding of human behaviors

Automatic quantification of human interaction behaviors based on language information has been shown to be effective in psychotherapy research domains such as marital therapy and cancer care. Existing systems typically use a moving-window approach where the target behavior construct is first quantified based on observations inside a window, such as a fixed number of words or turns, and then integrated over all the windows in that interaction. Given a behavior of interest, it is important to employ the appropriate length of observation, since too short a window might not contain sufficient information. Unfortunately, the link between behavior and observation length for lexical cues has not been well studied and it is not clear how these requirements relate to the characteristics of the target behavior construct. Therefore, in this paper, we investigate how the choice of window length affects the efficacy of language-based behavior quantification, by analyzing (a) the similarity between system predictions and human expert assessments for the same behavior construct and (b) the consistency in relations between predictions of related behavior constructs. We apply our analysis to a large and diverse set of behavior codes that are used to annotate real-life interactions and find that behaviors related to negative affect can be quantified from just a few words whereas those related to positive traits and problem solving require much longer observation windows. On the other hand, constructs that describe dysphoric affect do not appear to be quantifiable from language information alone, regardless of how long they are observed. We compare our findings with related work on behavior quantification based on acoustic vocal cues as well as with prior work on thin slices and human personality predictions and find that, in general, they are in agreement.
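The moving-window approach described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: `score_window`, the window length, the hop size, and the toy negative-affect scorer are all hypothetical choices made here for demonstration.

```python
from typing import Callable, List

def windowed_behavior_score(
    words: List[str],
    score_window: Callable[[List[str]], float],
    window_len: int = 50,
    hop: int = 25,
) -> float:
    """Quantify a behavior construct over a transcript:
    score each fixed-length window of words, then integrate
    (here: average) the per-window scores into a session-level estimate."""
    if len(words) <= window_len:
        # Transcript shorter than one window: score it whole.
        return score_window(words)
    scores = []
    for start in range(0, len(words) - window_len + 1, hop):
        scores.append(score_window(words[start:start + window_len]))
    return sum(scores) / len(scores)

# Hypothetical toy scorer: fraction of negative-affect words in a window.
NEGATIVE_WORDS = {"angry", "upset", "never", "hate"}

def toy_negative_affect(window: List[str]) -> float:
    return sum(w in NEGATIVE_WORDS for w in window) / len(window)
```

Varying `window_len` in a sketch like this is exactly the experimental knob the paper analyzes: for negative-affect constructs a few words per window may suffice, while positive-trait and problem-solving constructs need much longer windows.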


