gms | German Medical Science

24. Jahrestagung der Deutschen Gesellschaft für Audiologie

Deutsche Gesellschaft für Audiologie e. V.

14.09. - 17.09.2022, Erfurt

Influence of natural head movements on communication behavior in virtual multi-talker conversations

Meeting Abstract


  • presenting/speaker Angelika Kothe - Carl von Ossietzky Universität Oldenburg, Oldenburg, DE
  • Volker Hohmann - Carl von Ossietzky Universität Oldenburg, Oldenburg, DE
  • Giso Grimm - Carl von Ossietzky Universität Oldenburg, Oldenburg, DE

Deutsche Gesellschaft für Audiologie e.V. 24. Jahrestagung der Deutschen Gesellschaft für Audiologie. Erfurt, 14.-17.09.2022. Düsseldorf: German Medical Science GMS Publishing House; 2022. Doc155

doi: 10.3205/22dga155, urn:nbn:de:0183-22dga1552

Published: September 12, 2022

© 2022 Kothe et al.
This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 License. See license information at http://creativecommons.org/licenses/by/4.0/.



Text

To better meet the needs of hearing device users in everyday life, modern signal processing strategies aim for better performance in acoustically complex communication environments. Because hearing device performance interacts with natural user behaviour such as self-motion, evaluation methods that capture this interaction are required.

Virtual measurement environments allow for realistic and complex, yet controllable conditions. Recently developed scenes have been reported to resemble real life in several respects, such as self-motion [1], [2], and to offer some degree of involvement [3]. Still, differences in communication behaviour, as well as reported dissatisfaction with the animation of human characters in the scenes [3], indicate that further increasing involvement in the presented virtual environments may be beneficial. As human conversation includes non-acoustic communication, virtual conversation partners with natural movement, gestures and facial expressions are expected to increase involvement and the validity of observed behaviour.

In our work, we implemented the real-time transmission of head movements from remote conversation partners to virtual animated characters in the lab. We then conducted a human participant study to evaluate the resulting effect on participants and to gain further insight into meaningful measurement conditions for realistic environments.

Head movements and some aspects of facial motion were tracked using a combination of motion sensors and camera-based tracking software. Conversation behaviour and experienced involvement were evaluated in a live triadic interaction between normal-hearing participants in a plausible virtual scenario. The level of animation detail and the diffuse noise level served as independent variables. Indicators of involvement and communication behaviour were quantified from behavioural data and subjective ratings.
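For illustration only, a low-delay pose stream of the kind described above could be sketched as follows. This is a minimal assumption-laden example, not the authors' implementation: the wire format (a timestamp plus a head-orientation quaternion), the UDP transport, and all names and ports are hypothetical. UDP is a natural fit here because a late or lost pose sample should simply be superseded by the next one rather than stall the animation.

```python
import socket
import struct
import time

# Hypothetical wire format for one head-pose sample:
# float64 timestamp + 4 x float32 quaternion (w, x, y, z),
# network byte order -> 24 bytes per datagram.
POSE_FORMAT = "!d4f"

def pack_pose(timestamp, quat):
    """Serialize one head-pose sample into a compact binary packet."""
    return struct.pack(POSE_FORMAT, timestamp, *quat)

def unpack_pose(packet):
    """Deserialize a packet back into (timestamp, quaternion)."""
    fields = struct.unpack(POSE_FORMAT, packet)
    return fields[0], fields[1:]

class PoseSender:
    """Streams head-pose samples over UDP to the rendering machine
    that animates the virtual character (host/port are placeholders)."""

    def __init__(self, host="127.0.0.1", port=9000):
        self.addr = (host, port)
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def send(self, quat):
        # Each sample is sent immediately; stale packets are simply
        # superseded by newer ones on the receiving side.
        self.sock.sendto(pack_pose(time.time(), quat), self.addr)
```

On the receiving side, the renderer would apply the most recently received quaternion to the animated character's head joint, discarding any sample older than the last one applied.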

Preliminary results show that low-delay, real-time transmission of head movements and facial expressions is generally feasible, and indicate that participants perceive the virtual representation of their conversation partners as more natural.


References

1. Hendrikse MME, Llorach G, Grimm G, Hohmann V. Influence of visual cues on head and eye movements during listening tasks in multi-talker audiovisual environments with animated characters. Speech Communication. 2018;101:70-84. DOI: 10.1016/j.specom.2018.05.008
2. Hartwig M, Hohmann V, Grimm G. Speaking with avatars - influence of social interaction on movement behavior in interactive hearing experiments. IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). 2021. DOI: 10.1109/VRW52623.2021.00025
3. Hendrikse MME, Llorach G, Hohmann V, Grimm G. Movement and Gaze Behavior in Virtual Audiovisual Listening Environments Resembling Everyday Life. Trends Hear. 2019 Jan-Dec;23:2331216519872362. DOI: 10.1177/2331216519872362