gms | German Medical Science

4th Research in Medical Education (RIME) Symposium 2015

19.03-21.03.2015, München

Developing an alternative response format for the script concordance test

Meeting Abstract

4th Research in Medical Education (RIME) Symposium 2015. München, 19.-21.03.2015. Düsseldorf: German Medical Science GMS Publishing House; 2015. DocRD22

doi: 10.3205/15rime67, urn:nbn:de:0183-15rime676

Published: March 12, 2015

© 2015 Lahner et al.
This article is an Open Access article distributed under the terms of the Creative Commons License. It may be reproduced, distributed, and made publicly available, provided the author and source are cited.



Introduction: Clinical reasoning is essential for the practice of medicine. Theories of the development of medical expertise state that clinical reasoning begins with analytical processes, namely the storage of isolated facts and the logical application of the 'rules' of diagnosis. Learners then successively develop so-called semantic networks and illness scripts, which are finally used in an intuitive, non-analytic fashion [1], [2].

The script concordance test (SCT) is one instrument for assessing clinical reasoning [3]. However, the aggregate scoring [3] of the SCT is recognized as problematic [4]. The SCT's scoring leads to logical inconsistencies and is likely to reflect construct-irrelevant differences in examinees' response styles [4]. The expert panel judgments might also introduce an unintended error of measurement [4].
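To make the aggregate scoring discussed above concrete, the following sketch (a hypothetical Python illustration, not part of the project) computes partial credit the way the AMEE guide [3] describes it: an examinee's credit on an item equals the number of panelists who chose the same Likert anchor, divided by the size of the modal panel response.

```python
# Illustrative sketch of SCT aggregate scoring (assumed details: a
# 5-point Likert scale from -2 to +2 and a 10-member expert panel).
from collections import Counter

def aggregate_score(panel_answers, examinee_answer):
    """Partial credit for one SCT item under aggregate scoring.

    Credit = (panelists choosing the examinee's anchor) /
             (panelists choosing the modal anchor).
    """
    counts = Counter(panel_answers)
    modal_count = max(counts.values())
    return counts.get(examinee_answer, 0) / modal_count

# Example: a panel of 10 experts split across the scale.
panel = [-1, -1, -1, -1, 0, 0, 0, 1, 1, 2]
print(aggregate_score(panel, -1))  # modal answer: full credit, 1.0
print(aggregate_score(panel, 0))   # 3/4 credit, 0.75
print(aggregate_score(panel, 2))   # 1/4 credit, 0.25
```

Because every anchor endorsed by at least one panelist earns some credit, consistently hedging toward middle anchors can accumulate points regardless of an examinee's actual reasoning, which is one of the response-style concerns raised in [4].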

In this PhD project the following research questions will be addressed:

What would a format look like that assesses clinical reasoning (similar to the SCT) with multiple true-false questions, or with other formats with unambiguously correct answers, and thereby addresses the above-mentioned pitfalls in traditional SCT scoring?
How well does this format fulfill the Ottawa criteria for good assessment, with special regard to educational and catalytic effects [5]?


A first study will assess whether a new format using multiple true-false items to assess clinical reasoning in an SCT-like fashion can be designed in a theoretically and practically sound way. For this study, focus groups or interviews with assessment experts and students will be conducted.
In a second study using focus groups and psychometric data, Norcini and colleagues' Criteria for Good Assessment [5] will be evaluated for the new format in a real assessment setting. Furthermore, the scoring method for this new format will be optimized using real and simulated data.


[1] Schuwirth L. Is assessment of clinical reasoning still the Holy Grail? Med Educ. 2009;43(4):298-300. DOI: 10.1111/j.1365-2923.2009.03290.x
[2] Norman GR, Eva KW. Diagnostic error and clinical reasoning. Med Educ. 2010;44(1):94-100. DOI: 10.1111/j.1365-2923.2009.03507.x
[3] Lubarsky S, Dory V, Duggan P, Gagnon R, Charlin B. Script concordance testing: from theory to practice: AMEE guide no. 75. Med Teach. 2013;35(3):184-193. DOI: 10.3109/0142159X.2013.760036
[4] Lineberry M, Kreiter CD, Bordage G. Threats to validity in the use and interpretation of script concordance test scores. Med Educ. 2013;47(12):1175-1183. DOI: 10.1111/medu.12283
[5] Norcini J, Anderson B, Bollela V, Burch V, Costa MJ, Duvivier R, Galbraith R, Hays R, Kent A, Perrott V, Roberts T. Criteria for good assessment: consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach. 2011;33(3):206-214. DOI: 10.3109/0142159X.2011.551559