
67th Annual Meeting of the Deutsche Gesellschaft für Medizinische Informatik, Biometrie und Epidemiologie e. V. (GMDS), 13th Annual Congress of the Technologie- und Methodenplattform für die vernetzte medizinische Forschung e. V. (TMF)

21–25 August 2022, online

Reporting quality and risk of bias in studies evaluating the diagnostic test accuracy of clinical decision support systems: a systematic review of current practice

Meeting Abstract

  • Julia Böhnke - Institut für Epidemiologie und Sozialmedizin, Universität Münster, Münster, Germany
  • Julian Varghese - Institut für Medizinische Informatik, Westfälische Wilhelms-Universität Münster, Münster, Germany
  • André Karch - Institut für Epidemiologie und Sozialmedizin, Universität Münster, Münster, Germany
  • Nicole Rübsamen - Institut für Epidemiologie und Sozialmedizin, Universität Münster, Münster, Germany

Deutsche Gesellschaft für Medizinische Informatik, Biometrie und Epidemiologie. 67. Jahrestagung der Deutschen Gesellschaft für Medizinische Informatik, Biometrie und Epidemiologie e. V. (GMDS), 13. Jahreskongress der Technologie- und Methodenplattform für die vernetzte medizinische Forschung e.V. (TMF). sine loco [digital], 21.-25.08.2022. Düsseldorf: German Medical Science GMS Publishing House; 2022. DocAbstr. 81

doi: 10.3205/22gmds006, urn:nbn:de:0183-22gmds0065

Published: 19 August 2022

© 2022 Böhnke et al.
This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 License (attribution). For license details, see http://creativecommons.org/licenses/by/4.0/.


Text

Introduction: Diagnostic clinical decision support systems (CDSS) integrate and present processed information to healthcare professionals at the point of care so that they can make an appropriate diagnosis [1], [2], [3], [4]. CDSS have a medical purpose; thus, they must be safe and effective [5], [6]. CDSS should be evaluated in diagnostic test accuracy (DTA) studies using DTA indices (e.g., sensitivity and specificity) [3], [4]; these studies should be reported in accordance with STARD [7], and decision makers should critically appraise their risk of bias with QUADAS-2 [8]. This systematic review assesses reporting quality and risk of bias in studies evaluating the DTA of CDSS.
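
For context, the DTA indices mentioned above follow the standard definitions in terms of true positives (TP), false negatives (FN), true negatives (TN), and false positives (FP); these formulas are general background and are not taken from the reviewed studies:

\[
\text{Sensitivity} = \frac{TP}{TP + FN}, \qquad
\text{Specificity} = \frac{TN}{TN + FP}
\]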

Methods: This study, including its protocol, was registered with PROSPERO (CRD42021226570). We searched PubMed/MEDLINE, Scopus, Web of Science, and the Cochrane Library for studies evaluating the DTA of CDSS intended for use in human patients (irrespective of the target diagnosis). Studies had to be published between 2016-01-01 (i.e., after the 2015 update of STARD) and 2021-05-31 (date of the database search). We excluded studies focusing on treatment, disease follow-up, disease severity, or economic evaluation, as well as studies relying on patients’ self-diagnosis or comparing pre- and post-implementation periods of a CDSS. We assessed the reporting quality (using the 30-item STARD 2015 checklist) and the risk of bias (using the 14-item QUADAS-2 tool) of each eligible study. We rated each item as “reported transparently/low risk of bias”, “reported semi-transparently/unclear risk of bias”, or “not reported/high risk of bias” and plotted the items’ ratings stratified by year of publication. Additionally, we conducted stratified analyses according to the CDSS development phase, whether studies were subject to reporting guideline policies, and the authors’ expertise. This abstract is reported according to PRISMA-DTA for Abstracts [9].
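
As an illustration only (not the authors’ analysis code), the following minimal sketch shows how such three-level item ratings could be tabulated and stratified by year of publication, assuming a hypothetical long-format table with one row per study and checklist item (the item identifiers and ratings below are invented examples):

    # Hypothetical long-format table: one row per study and checklist item.
    import pandas as pd

    ratings = pd.DataFrame({
        "study": ["S1", "S1", "S2", "S2", "S3", "S3"],
        "year": [2017, 2017, 2019, 2019, 2021, 2021],
        "item": ["STARD_10a", "QUADAS2_D1"] * 3,
        "rating": ["transparent/low", "transparent/low",
                   "semi-transparent/unclear", "semi-transparent/unclear",
                   "not reported/high", "not reported/high"],
    })

    # Proportion of each rating level per item and publication year.
    summary = (
        ratings.groupby(["item", "year"])["rating"]
        .value_counts(normalize=True)
        .rename("proportion")
        .reset_index()
    )
    print(summary)

The resulting proportions could then be plotted per item and year, analogous to the stratified presentation described above.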

Results: We included 158 of 2,820 screened studies. The included studies differed in sample size (11 to 1.4 million individuals), duration of data collection (2 months to 40 years), study design, and other study characteristics. Seventy-nine percent of the studies presented a Phase-I/II CDSS. Studies differed in their reporting quality and risk of bias, with few studies at either end of the quality continuum. Concerns about the applicability of findings were less pronounced. Overall reporting quality and risk of bias were mostly affected by items related to participant selection, study design, flow and timing, blinding, test methods, test results, and other information (e.g., registration, study protocol, funding). No time trends were observed. Effects of the CDSS development phase and of being subject to reporting guideline policies were not apparent, whereas studies involving researchers from interdisciplinary scientific fields showed better overall quality.

Discussion: Our systematic review included all studies evaluating the DTA of a CDSS, some of which may bear little resemblance to a ‘true’ DTA study. Nevertheless, this range of studies showed that reporting quality and risk of bias are heterogeneous, particularly for items related to ‘methods’, ‘results’, and ‘other information’; thus, uncertainties in research findings remain.

Conclusion: Reporting deficiencies need to be addressed by implementing incentives and measures that increase transparency and reduce the risk of bias.

Funding: This work was supported by the German Federal Ministry of Health [grant number 252DAT66A].

The authors declare that they have no competing interests.

The authors declare that an ethics committee vote is not required.


References

1. Berner ES, La Lande TJ. Overview of clinical decision support systems. In: Berner ES, editor. Clinical decision support systems. 2nd ed. New York: Springer Science+Business Media; 2007. p. 3–22. DOI: 10.1007/978-0-387-38319-4_1
2. Sutton RT, Pincock D, Baumgart DC, Sadowski DC, Fedorak RN, Kroeker KI. An overview of clinical decision support systems: benefits, risks, and strategies for success. npj Digit Med. 2020;3(1):1–10. DOI: 10.1038/s41746-020-0221-y
3. Wasylewicz ATM, Scheepers-Hoeks AMJW. Clinical decision support systems. In: Kubben P, Dumontier M, Dekker A, editors. Fundamentals of clinical data science. Cham: Springer; 2018. p. 153–69. DOI: 10.1007/978-3-319-99713-1_11
4. Miller RA, Geissbuhler A. Diagnostic decision support systems. In: Berner ES, editor. Clinical decision support systems. 2nd ed. New York: Springer Science+Business Media; 2007. p. 99–125. DOI: 10.1007/978-0-387-38319-4_5
5. European Medicines Agency. Medical devices [Internet]. [cited 2022 Feb 9]. Available from: https://www.ema.europa.eu/en/human-regulatory/overview/medical-devices
6. Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices, amending Directive 2001/83/EC, Regulation (EC) No 178/2002 and Regulation (EC) No 1223/2009, and repealing Council Directives 90/385/EEC and 93/42/EEC. OJ L 117, 5.5.2017. Available from: http://data.europa.eu/eli/reg/2017/745/oj
7. Cohen JF, Korevaar DA, Altman DG, Bruns DE, Gatsonis CA, Hooft L, et al. STARD 2015 guidelines for reporting diagnostic accuracy studies: explanation and elaboration. BMJ Open. 2016;6(11):e012799. DOI: 10.1136/bmjopen-2016-012799
8. Whiting PF, Rutjes AWS, Westwood ME, Mallett S, Deeks JJ, Reitsma JB, et al. QUADAS-2: a revised tool for the quality assessment of diagnostic accuracy studies. Ann Intern Med. 2011;155(8):529–36. DOI: 10.7326/0003-4819-155-8-201110180-00009
9. Cohen JF, Deeks JJ, Hooft L, Salameh JP, Korevaar DA, Gatsonis C, et al. Preferred reporting items for journal and conference abstracts of systematic reviews and meta-analyses of diagnostic test accuracy studies (PRISMA-DTA for Abstracts): checklist, explanation, and elaboration. BMJ. 2021;372:n265. DOI: 10.1136/bmj.n265