gms | German Medical Science

Gesundheit – gemeinsam. Kooperationstagung der Deutschen Gesellschaft für Medizinische Informatik, Biometrie und Epidemiologie (GMDS), Deutschen Gesellschaft für Sozialmedizin und Prävention (DGSMP), Deutschen Gesellschaft für Epidemiologie (DGEpi), Deutschen Gesellschaft für Medizinische Soziologie (DGMS) und der Deutschen Gesellschaft für Public Health (DGPH)

08.09. - 13.09.2024, Dresden

Automation Bias in AI-Decision Support: Results from an Empirical Study

Meeting Abstract

  • Florian Kücking - Hochschule Osnabrück - University of Applied Sciences, Osnabrück, Germany
  • Ursula Hertha Hübner - Hochschule Osnabrück - University of Applied Sciences, Osnabrück, Germany
  • Mareike Przysucha - Hochschule Osnabrück - University of Applied Sciences, Osnabrück, Germany
  • Niels Hannemann - Universität Osnabrück, Abteilung New Public Health, Osnabrück, Germany
  • Jan-Oliver Kutza - Hochschule Osnabrück - University of Applied Sciences, Osnabrück, Germany
  • Maurice Moelleken - Universitätsklinikum Essen, Essen, Germany
  • Cornelia Erfurt-Berge - Universitätsklinikum Erlangen, Erlangen, Germany
  • Joachim Dissemond - Universitätsklinikum Essen, Essen, Germany
  • Birgit Babitsch - Universität Osnabrück, Abteilung New Public Health, Osnabrück, Germany
  • Dorothee Busch - Hochschule Osnabrück - University of Applied Sciences, Osnabrück, Germany

Gesundheit – gemeinsam. Kooperationstagung der Deutschen Gesellschaft für Medizinische Informatik, Biometrie und Epidemiologie (GMDS), Deutschen Gesellschaft für Sozialmedizin und Prävention (DGSMP), Deutschen Gesellschaft für Epidemiologie (DGEpi), Deutschen Gesellschaft für Medizinische Soziologie (DGMS) und der Deutschen Gesellschaft für Public Health (DGPH). Dresden, 08.-13.09.2024. Düsseldorf: German Medical Science GMS Publishing House; 2024. DocAbstr. 307

doi: 10.3205/24gmds062, urn:nbn:de:0183-24gmds0625

Published: September 6, 2024

© 2024 Kücking et al.
This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 License. See license information at http://creativecommons.org/licenses/by/4.0/.


Text

Introduction: Automation bias poses a significant challenge to the effectiveness of Clinical Decision Support Systems (CDSS), potentially compromising diagnostic accuracy. Previous research highlights trust, self-confidence, and task difficulty as key determinants. With the increasing availability of AI-enabled CDSS, automation bias is attracting renewed attention. This study therefore aims to identify factors influencing automation bias in a diagnostic task.

Methods: A quantitative intervention study with participants from different professional backgrounds (n = 210) was conducted, using regression analysis to identify influencing factors. Automation bias was measured as the rate of agreement with incorrect AI-enabled recommendations.
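The abstract does not publish the study's model specification, so the following is only an illustrative sketch of how such an analysis might be set up: false agreement with deliberately wrong AI recommendations is modeled by logistic regression on hypothetical predictors (all variable names and effect sizes are assumptions, not study data).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 210  # sample size reported in the abstract

# Hypothetical predictors (names and distributions are assumptions)
diagnostic_performance = rng.uniform(0, 1, n)  # accuracy on control cases
certified_training = rng.integers(0, 2, n)     # 1 = certified wound care training
perceived_benefit = rng.uniform(1, 5, n)       # Likert-type rating of the CDSS

# Simulated outcome: better diagnosticians and trained participants agree
# less often with wrong recommendations; higher perceived benefit promotes
# false agreement (direction of effects taken from the Results section)
logit = (0.5
         - 2.0 * diagnostic_performance
         - 1.0 * certified_training
         + 0.6 * perceived_benefit)
false_agreement = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Fit a logistic regression of false agreement on the predictors
X = np.column_stack([diagnostic_performance, certified_training, perceived_benefit])
model = LogisticRegression().fit(X, false_agreement)
coef = model.coef_[0]
print(dict(zip(["performance", "training", "benefit"], coef.round(2))))
```

On simulated data like this, the fitted coefficients recover the signs of the planted effects; in the actual study, the corresponding coefficients would quantify how each factor shifts the odds of agreeing with an incorrect recommendation.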

Results and discussion: Diagnostic performance, certified wound care training, physician profession, and female gender were significantly associated with lower false agreement rates, whereas higher perceived benefit of the system was significantly associated with higher false agreement rates. Strategies such as comprehensive diagnostic training are therefore pivotal for preventing automation bias when implementing CDSS.

Conclusion: Considering the factors that influence automation bias when introducing a CDSS is critical to fully leverage the benefits of such a system. This study highlights that non-specialists, who stand to gain the most from CDSS, are also the most susceptible to automation bias. Specialized training is therefore needed to mitigate this risk and to ensure diagnostic accuracy and patient safety.

The authors declare that they have no competing interests.

The authors declare that a positive ethics committee vote has been obtained.