A virtual reality-based method for examining audiovisual prosody perception

09/13/2022
by Hartmut Meister, et al.

Prosody plays a vital role in verbal communication. Acoustic cues of prosody have been examined extensively. However, prosodic characteristics are perceived not only auditorily but also visually, via head and facial movements. The purpose of this report is to present a method for examining audiovisual prosody using virtual reality. We show that animations based on a virtual human provide motion cues similar to those obtained from video recordings of a real talker. The use of virtual reality opens up new avenues for examining multimodal effects of verbal communication. We discuss the method in the framework of examining prosody perception in cochlear implant listeners.
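
The core comparison behind such a setup, namely whether an animated virtual human reproduces the head- and face-motion cues of a recorded real talker, can be illustrated with a short sketch. The code below is a minimal, hypothetical example and not the authors' analysis pipeline: it assumes two time-stamped head-pitch trajectories, one tracked from video of the real talker and one exported from the virtual-character animation, resamples them onto a common time base, and quantifies their similarity with a Pearson correlation. File names and data layout are assumptions for illustration.

```python
# Minimal sketch (not the paper's pipeline): compare a head-motion cue
# extracted from a video of a real talker with the same cue exported from
# a virtual-human animation. File names and column layout are assumptions.
import numpy as np
from scipy.stats import pearsonr

def load_trajectory(path):
    """Load a two-column CSV: time in seconds, head pitch in degrees."""
    data = np.loadtxt(path, delimiter=",")
    return data[:, 0], data[:, 1]

# Hypothetical input files.
t_video, pitch_video = load_trajectory("real_talker_head_pitch.csv")
t_avatar, pitch_avatar = load_trajectory("virtual_human_head_pitch.csv")

# Resample both trajectories onto a common time base (here 60 Hz),
# restricted to the interval where both recordings overlap.
t_common = np.arange(max(t_video[0], t_avatar[0]),
                     min(t_video[-1], t_avatar[-1]), 1 / 60)
video_rs = np.interp(t_common, t_video, pitch_video)
avatar_rs = np.interp(t_common, t_avatar, pitch_avatar)

# Similarity of the motion cues: Pearson correlation of the trajectories.
r, p = pearsonr(video_rs, avatar_rs)
print(f"Head-pitch correlation, real vs. virtual talker: r = {r:.2f} (p = {p:.3g})")
```

Resampling onto a shared time base is needed because video tracking and animation export typically run at different frame rates; a correlation of this kind captures only the temporal pattern of one cue, so a fuller comparison would repeat it for other head and facial movement parameters.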
