Perception of emotional expression in cochlear implant users
Published: 18 March 2025
Speech comprehension counts as a benchmark outcome of cochlear implants (CIs), but this benchmark disregards the communicative importance of non-verbal vocal signals. Accordingly, CI users’ ability to recognize vocal emotions remains strikingly understudied, even though it is essential for social communication and closely connected to quality of life in CI users. To fill this knowledge gap, we investigated vocal emotion perception in CI users and the effects of facial information on this ability. In all experiments, we utilized state-of-the-art voice morphing methods to precisely control acoustic parameters in voice recordings.
Across experiments, CI users showed lower overall performance than normal-hearing (NH) individuals in vocal emotion perception, with or without facial information. Importantly, there were large interindividual differences among CI users, with low performers responding close to guessing level. Whereas NH individuals utilized timbre and fundamental frequency information to equivalent degrees when recognizing vocal emotions, CI users were more efficient in using timbre (compared to fundamental frequency) information for the same task. Some CI users could use timbre information remarkably well, demonstrating that CI devices can efficiently transmit timbre signals. Crucially, building on the finding that emotion perception with a CI can be improved by vocal caricatures, we developed and tested a perceptual training program with caricatures as training stimuli – with promising results. We also created a substantial audiovisual (AV) database of emotional voice and dynamic face stimuli (with voices varying in emotional intensity via different morph levels, allowing adaptive testing and calibration of task difficulty) to study AV emotion perception in CI users. Compared to NH individuals, CI users exhibited stronger benefits to vocal emotion perception when time-synchronized, congruent facial emotional information was available, and these larger crossmodal benefits were maintained even at equal auditory-only performance levels. Importantly, this pattern suggests that the benefits reflect deafness-related compensation rather than merely degraded acoustic representations. Finally, the findings confirmed the positive relationship between vocal emotion recognition abilities and quality-of-life ratings in CI users.
Overall, the current studies suggest that AV stimuli are beneficial during CI rehabilitation. Moreover, they demonstrate that morphing, and specifically caricaturing, provides novel perspectives not only for assessing sensory determinants of human communication but also for improving the perception of emotional expression and, ultimately, quality of life in CI users.