Examination of audiovisual prosody
Published: 1 March 2023
Background: Prosody plays a vital role in verbal communication. It is important for the expression of emotions (affective prosody), but it also carries speech information related to word and sentence stress or the distinction between questions and statements (linguistic prosody). Acoustic cues of prosody, such as intonation patterns, intensity, and duration, have been examined extensively. However, prosodic characteristics are not only perceived auditorily but also visually, as head and facial movements accompany the production of prosody. Still, in comparison to auditory cues, these multimodal mechanisms are poorly understood.
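To illustrate what a controlled manipulation of an acoustic prosody cue can look like, the following minimal sketch compresses an F0 (intonation) contour toward its mean. The function name, values, and the compression approach are illustrative assumptions, not taken from the article:

```python
import numpy as np

def compress_f0(f0_hz, factor):
    """Compress an F0 (pitch) contour toward its mean value.

    factor=1.0 leaves the contour unchanged; factor=0.0 flattens it
    completely, removing the intonation cue while keeping the mean pitch.
    """
    f0_hz = np.asarray(f0_hz, dtype=float)
    mean = f0_hz.mean()
    return mean + factor * (f0_hz - mean)

# A rising, question-like contour, compressed to half its excursion:
contour = np.array([180.0, 190.0, 205.0, 230.0])  # Hz, hypothetical values
flattened = compress_f0(contour, 0.5)
```

Graded stimuli of this kind (varying `factor` stepwise) are what makes psychoacoustic quantification of prosody perception possible in the auditory domain.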
Methods: Whereas controlled manipulations of acoustic cues are a well-established method for uncovering and quantifying prosody perception, such an approach is much more complicated for visual prosody. Here, we describe a novel approach based on video-realistic animations of virtual humans. This method has the advantage that, in parallel to acoustic manipulations, head and facial movements can be parametrized. Eye tracking can be used to determine the observers' gaze, and pupillometry may reflect the load associated with perceiving prosodic cues.
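The parametrization described above might be sketched as follows: animation channels of a virtual human (e.g. head rotation, eyebrow raising) are stored per frame, and a single visual cue can be rescaled or presented in isolation. The channel names, values, and helper function are hypothetical, chosen only to illustrate the idea:

```python
import numpy as np

# Hypothetical animation channels for one utterance: each entry maps a
# visual prosody cue to per-frame values (degrees for head rotation,
# normalized activation for eyebrow raising).
channels = {
    "head_pitch": np.array([0.0, 2.0, 6.0, 3.0, 0.0]),
    "brow_raise": np.array([0.0, 0.3, 0.8, 0.4, 0.0]),
}

def isolate_cue(channels, keep, gain=1.0):
    """Keep one visual cue (optionally rescaled by `gain`) and suppress
    all other channels, leaving the timing of the animation untouched."""
    return {
        name: gain * values if name == keep else np.zeros_like(values)
        for name, values in channels.items()
    }

# Present only an exaggerated head nod, with eyebrow movement removed:
nod_only = isolate_cue(channels, "head_pitch", gain=1.5)
```

Selecting and scaling cues in this way mirrors, on the visual side, the graded stimulus manipulations long used for acoustic prosody.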
Results: Animations based on a virtual human are generally capable of providing motion cues similar to those obtained from a video recording of a real talker. Parametrization allows fine-grained manipulation of visual prosody cues, which can be applied to single cues in isolation. Initial results of cochlear implant users examined with the described method are presented.
Conclusions: Applying virtual humans opens up new avenues for exploring mechanisms behind multimodal verbal communication. Specifically, we discuss the application of this technique in the framework of examining prosody perception in listeners with cochlear implants, who are limited in their use of auditory cues. Supported by Deutsche Forschungsgemeinschaft, grant ME 2751-4.1 to HM.