gms | German Medical Science

13. Grazer Konferenz – Teaching Medicine – an Interprofessional Agenda

September 24–26, 2009, Innsbruck, Austria

Deriving constructional rules for a Knowledge on Skills Test


  • corresponding author Michaela Wagner-Menghin - Medical University of Vienna, Core Unit for Medical Education, Vienna, Austria
  • author Ingrid Preusche - Medical University of Vienna, Core Unit for Medical Education, Vienna, Austria
  • author Joachim Punter - Medical University of Vienna, Core Unit for Medical Education, Vienna, Austria
  • author Michael Schmidts - Medical University of Vienna, Core Unit for Medical Education, Vienna, Austria

13. Grazer Konferenz - Qualität der Lehre: Teaching Medicine – an Interprofessional Agenda. Innsbruck, Österreich, 24.-26.09.2009. Düsseldorf: German Medical Science GMS Publishing House; 2009. Doc09grako23

doi: 10.3205/09grako23, urn:nbn:de:0183-09grako234

Published: December 14, 2009

© 2009 Wagner-Menghin et al.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (you are free to share, i.e. to copy, distribute and transmit the work, provided the original author and source are credited).


Clinical skills are usually not the main target of multiple-choice (MC) exams, but MC assessments and OSCEs have already been combined successfully. Detailed MC-item writing guides usually focus on technical aspects and suggest content-specific templates for basic science and clinical problem solving. However, we could not find templates for “knowledge on skills” (KOS) items. This study investigates how well available item-writing guidelines apply when constructing MC-items for a KOS-test, and proposes templates for constructing such items.

Method: Junior teachers (teaching basic clinical procedures and physical examination) with at least one year of teaching experience, but without previous experience in MC-item writing, attended a 2-hour item-writing workshop and were invited to submit MC-items. Items were reviewed by experts and peers against the following criteria: practicing in the skills lab is sufficient for answering the item; relevance of content in light of the learning objectives; plausibility of the intended correct answer; plausibility of the distractors; formal characteristics.

Results: Basic clinical procedures: 85 items submitted, 37 accepted; physical examination: 59 items submitted, 32 accepted. Ten different constructional rules could be derived, eight of which were regarded as relevant for a KOS-test.

Discussion: Only about 50% of the submitted MC-items were accepted, which fell short of our expectations, as acceptance rates of about 80% have been reported in other fields. We therefore conclude that the available templates are not optimal for constructing MC-items for a KOS-test. The constructional rules we derived from the submitted KOS material support further discussion of whether some topics are better assessed with other item types. They also help to identify topics not covered by the accepted MC-item pool. Further research should focus on increasing authors’ efficiency by providing content-specific templates for KOS-test MC-items.

