gms | German Medical Science

GMS Journal for Medical Education

Gesellschaft für Medizinische Ausbildung (GMA)

ISSN 2366-5017

A simulation-based OSCE with case presentation and remote rating – development of a prototype


Authors:

  • Lisa Bußenius - Universitätsklinikum Hamburg-Eppendorf, III. Medizinische Klinik, Hamburg, Germany
  • Sigrid Harendza (corresponding author) - Universitätsklinikum Hamburg-Eppendorf, III. Medizinische Klinik, Hamburg, Germany

GMS J Med Educ 2023;40(1):Doc12

doi: 10.3205/zma001594, urn:nbn:de:0183-zma0015945

This is the English version of the article; a German version is also available.

Received: October 6, 2022
Revised: December 14, 2022
Accepted: January 10, 2023
Published: February 15, 2023

© 2023 Bußenius et al.
This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 License.


Abstract

Simulation-based examination formats make it possible to assess medical students’ competences during their performance. Additionally, videotaping of simulations allows for remote rating, which offers advantages for raters, students, and exam organizers. We describe a simulation-based OSCE prototype with remote rating of students’ competences, developed to replace a conventional OSCE at Hamburg Medical Faculty. The assessment consists of two phases: a consultation phase with four simulated patient encounters and a case presentation phase in which four students present two cases each. All encounters from the consultation and presentation phases are to be videotaped and remotely rated by clinical raters. Advanced medical students (year 4) are to be assessed with regard to their clinical knowledge as well as physician-patient communication, clinical reasoning competence, and patient management competence. We provide detailed schedules for the simulation-based OSCE procedure and a roster for its organization. When piloting the assessment, we encountered two major obstacles relating to legal obligations regarding examination time and videotaping, which allowed us to provide tips on how to successfully implement this assessment prototype. Once successfully implemented, remote rating will help students concentrate on their consultation or presentation tasks, reduce raters’ time constraints, and allow for randomized rating. Using established instruments for competence rating rather than OSCE checklists is an additional feature of this simulation-based OSCE prototype. Legal issues can be avoided by using the prototype for formative assessment but should be addressed in advance when it is planned to be used for summative assessment.

Keywords: assessment, competence, OSCE, remote rating, simulation

1. Background

With the National Competence Based Catalogue of Learning Objectives for Undergraduate Medical Education [1], medical curricula in Germany are becoming increasingly competence-based. This, in turn, requires the development of competence-based assessment formats. Currently, objective structured clinical examinations (OSCEs) – checklist-based assessments in the presence of raters – are used to assess medical students’ skills [2]. To assess competences, simulation-based assessments with a more holistic rating approach can include greater task complexity in more authentic settings to allow expression of performance [3]. Additionally, videotaped simulations allow for remote rating, which reduces raters’ influence on students’ performance [4]. Time-independent rating also allows for flexible work scheduling, which can reduce OSCE-related logistical issues and costs [5]. To assess advanced medical students’ competences in a summative exam, we developed a prototypical simulation-based OSCE including remote rating options.

2. Objectives

Our main goal was to replace a traditional OSCE with a simulation-based OSCE to allow assessment of students’ competences. To design an OSCE prototype for this purpose, we used an evidence-based approach [6]. Our target group was advanced medical students (year 4), who had acquired the knowledge, skills, and attitudes needed to perform in a simulated setting.

3. Developing the prototype

For our project, we chose an OSCE at the end of a fourth-year learning module in the integrated curriculum at the University of Hamburg [7]. At the end of this module, students must be able to communicate with patients and take detailed histories, discuss differential diagnoses based on diagnostic results, and suggest therapeutic approaches. The designated module covers medical topics around the abdomen and retroperitoneum from the clinical disciplines of gastroenterology, general surgery, nephrology, and urology. The final OSCE, with about 200 students, takes place over two days and includes seven five-minute stations and checklists for the assessment of clinical knowledge and communication skills.

The aim of the OSCE redesign was to measure students’ clinical knowledge and communication skills by assessing their physician-patient communication competence, clinical reasoning competence, and patient management competence while covering the clinical content of the module. For this purpose, we designed four stations simulating physician-patient encounters (phase 1: consultations) and four stations simulating patient-case presentations in handover situations (phase 2: presentations), based on similar steps used in a validated competence-based simulation of a first day in hospital [6] (see figure 1 [Fig. 1]). In phase 1 (consultations), eight students rotate through four 6-minute stations in two parallel courses, A and B. The students, who are randomly assigned to their starting positions, encounter different cases from each of the four clinical disciplines in the two courses. Students’ assignments for each station include taking focussed histories from simulated patients and discussing test results with them, which are presented during the patient interview. All encounters are videotaped, which allows for remote assessment. Besides the correctness of the respective medical content, clinical reasoning and physician-patient communication competences can be remotely assessed with specific questionnaires [8], [9]. In phase 2 (presentations), groups A and B are combined, so that two students meet in one room for a 6-minute handover. Group B rotates in the opposite direction of group A to ensure an even mix of student pairs. Every student will present (p) and receive (r) two patients so that all four disciplines are covered in this phase (e.g., in round 1, student 1 presents (p) a gastroenterology case to student 8, who receives (r) it, while in round 2 student 8 presents (p) a nephrology case to student 3, who receives (r) it). The students’ assignment includes presenting and discussing each case in order to reach a management conclusion.
Analogous to phase 1, all handovers will be videotaped in phase 2 which allows for remote assessment. Besides assessing students’ knowledge of medical content, the presenting and the receiving student are also assessed with regard to their communication competence [9], [10], clinical reasoning [8], and patient management [11].
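The counter-rotating pairing scheme described above can be sketched as follows. This is a minimal illustration consistent with the round-1 example in the text (student 1 presenting to student 8); the concrete roster and the discipline-to-station assignment are defined in figure 1 and may differ in detail, so the rotation rule below is an assumption, not the published schedule.

```python
# Hypothetical sketch of the phase-2 handover roster: groups A and B rotate
# in opposite directions, and the presenting role alternates between the
# groups each round. Discipline assignment per station is omitted here.

GROUP_A = [1, 2, 3, 4]  # students from course A
GROUP_B = [5, 6, 7, 8]  # students from course B

def phase2_roster():
    """Return (round, station, presenter, receiver) for all 16 handovers."""
    roster = []
    for r in range(4):        # four 6-minute handover rounds
        for k in range(4):    # four stations running in parallel
            a = GROUP_A[(k + r) % 4]      # group A rotates forward
            b = GROUP_B[(3 - k + r) % 4]  # group B rotates backward
            # group A presents in rounds 1 and 3, group B in rounds 2 and 4
            presenter, receiver = (a, b) if r % 2 == 0 else (b, a)
            roster.append((r + 1, k + 1, presenter, receiver))
    return roster

for entry in phase2_roster():
    print(entry)
```

With this rule, every student presents exactly two patients and receives exactly two, as required by the design.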

With one minute for changing between the physician-patient encounters and between handovers, and two minutes to change from phase 1 (consultation) to phase 2 (presentation), the simulation-based OSCE lasts 56 minutes. With overlapping phases (i.e. when group 1 enters the presentation phase, a new group can start with the consultation phase), 100 students can be examined in one day during normal working hours with reasonable breaks for the simulated patients (see figure 2 [Fig. 2]). With remote rating, no clinical raters need to be present on-site; they can perform their ratings when they are off clinical duty. Every student will receive eight different ratings, four from phase 1 and four from phase 2, i.e. two from each discipline (see table 1 [Tab. 1]).
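The timing above can be reproduced with a quick back-of-the-envelope calculation. All constants are taken from the text; the breaks for simulated patients shown in figure 2 are deliberately not modelled.

```python
import math

STATION_MIN = 6       # each consultation or handover lasts 6 minutes
CHANGE_MIN = 1        # one minute to change between stations/handovers
PHASE_CHANGE_MIN = 2  # two minutes to change from phase 1 to phase 2
STATIONS = 4          # four stations per phase
GROUP_SIZE = 8        # eight students run through the OSCE together

# One phase: four 6-minute stations with three 1-minute changes in between.
phase_min = STATIONS * STATION_MIN + (STATIONS - 1) * CHANGE_MIN  # 27 minutes

# Both phases plus the phase change give the exam length per group.
total_min = 2 * phase_min + PHASE_CHANGE_MIN  # 56 minutes

# With overlapping phases, a new group starts roughly every half exam length,
# so 100 students (13 groups of up to 8) fit into one working day.
groups_for_100 = math.ceil(100 / GROUP_SIZE)  # 13 groups

print(total_min, groups_for_100)  # 56 13
```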

4. Pitfalls with and tips for implementation

We piloted the simulation-based OSCE as a summative assessment in July 2019 with 186 students [12]. However, due to time constraints in the exam regulations of our medical school, we were only allowed 42 minutes to incorporate seven subjects into our assessment. Thus, we were not able to use our planned design with the two phases, which would have required 56 minutes. Tip 1: if you wish to use this prototype with a consultation and a presentation phase including changing times, check your medical school’s exam regulations for limitations on examination time. In our case, we had to stick to the traditional OSCE format and cancel the presentation phase.

Furthermore, for legal reasons at our medical school, we were not allowed to film the conversations the students held with the simulated patients. Therefore, remote rating, a main feature of assessment in our prototype, could not take place. Tip 2: if you wish to use remote rating, check with your legal department whether it is possible to take videos of students for remote assessment. Audio recordings could be an alternative if they are permitted.

However, we were at least able to keep the core idea of our project by introducing elements of simulation into the traditional OSCE format. Students entered the examination rooms not knowing which medical content of the four clinical disciplines to expect from a certain simulated patient, which required good history-taking skills. Tip 3: Adapt the patient cases from real cases and develop them together with experienced clinicians to increase content validity [13].

Instead of video ratings, our examiners had to be present in the room for their ratings, but they did not interfere in the student-patient communication. Furthermore, due to the exam regulations at Hamburg Medical Faculty, which require a certain number of points per OSCE station, the raters could not use the rating instruments for clinical reasoning [8] and physician-patient communication competences [9] that we had planned for the consultation phase. However, instead of rating medical content alone, we were at least able to integrate clinical reasoning and communication aspects into our usual OSCE checklists. As an additional educational research project, raters with a background in psychosocial fields were permitted in the examination rooms to rate the students’ communication competence with the Global Rating Scale [9]. Tip 4: Before using the suggested instruments for communication competence, clinical reasoning, and patient management, check with your local exam regulations whether they will be allowed for summative assessment.

Due to legal regulations at Hamburg Medical Faculty, we were not able to perform the simulation-based OSCE with competence-based remote rating as a substitute for our usual summative OSCE. However, parts of the simulation, i.e. consultations and case presentations including remote rating of clinical reasoning, communication competence, and patient management, have been demonstrated elsewhere to be performed successfully [8], [9], [10], [11]. The suggested prototype even works as a telemedicine format, which we implemented in an extended form as a formative assessment with feedback [13]. Time-independent remote rating fits well into assessors’ work schedules. Further advantages of remote rating include that videos can be assessed in randomized order to prevent bias caused by order effects [14], and that assessors can take breaks at their own convenience to reduce rater fatigue [15]. Moreover, blinding students to specific stations’ content helps to prepare them for real consultations, where the cause of a patient’s problem is unknown. Additionally, remote rating requires less personnel on-site while the assessment takes place. When considering implementation of this OSCE format, we recommend clarifying anticipated barriers, such as limited assessment time or possible legal issues with filming or data storage, ahead of planning.


Funding

This work was supported by the Federal Ministry of Education and Research (project number: 01GK1801A). The funding body played no role in the design of the study, in the collection, analysis, and interpretation of data, or in writing the manuscript.

Ethical approval

The Ethics Committee of the Hamburg Chamber of Physicians confirmed the innocuousness of this study and its congruence with the Declaration of Helsinki (WF-047/16). The anonymity of all participants was guaranteed.


Acknowledgements

We thank all participating medical students, actors and actresses, raters, and the Deanery of the Medical Faculty Hamburg for their support in piloting this assessment format.

Competing interests

The authors declare that they have no competing interests.


References

1. Fischer MR, Bauer D, Mohn K, NKLM-Projektgruppe. Finally finished! National competence based catalogues of learning objectives for undergraduate medical education (NKLM) and dental education (NKLZ) ready for trial. GMS J Med Educ. 2015;32(3):Doc35. DOI: 10.3205/zma000977
2. Newble D. Techniques for measuring clinical competence: objective structured clinical examinations. Med Educ. 2004;38(2):199-203. DOI: 10.1111/j.1365-2923.2004.01755.x
3. Rotthoff T, Kadmon M, Harendza S. It does not have to be either or! Assessing competence in medicine should be a continuum between an analytic and a holistic approach. Adv Health Sci Educ Theory Pract. 2021;26(5):1659-1673. DOI: 10.1007/s10459-021-10043-0
4. Yeates P, Cope N, Hawarden A, Bradshaw H, McCray G, Homer M. Developing a video-based method to compare and adjust examiner effects in fully nested OSCEs. Med Educ. 2019;53(3):250-263. DOI: 10.1111/medu.13783
5. Vivekananda-Schmidt P, Lewis M, Coady D, Morley C, Kay L, Walker D, Hassell AB. Exploring the use of videotaped objective structured clinical examination in the assessment of joint examination skills of medical students. Arthritis Rheum. 2007;57(5):869-876. DOI: 10.1002/art.22763
6. Prediger S, Schick K, Fincke F, Fürstenberg S, Oubaid V, Kadmon M, Berberat PO, Harendza S. Validation of a competence-based assessment of medical students' performance in the physician's role. BMC Med Educ. 2020;20:6. DOI: 10.1186/s12909-019-1919-x
7. Rheingans A, Soulos A, Mohr S, Meyer J, Guse AH. The Hamburg integrated medical degree program iMED. GMS J Med Educ. 2019;36(5):Doc52. DOI: 10.3205/zma001260
8. Fürstenberg S, Helm T, Prediger S, Kadmon M, Berberat PO, Harendza S. Assessing clinical reasoning in undergraduate medical students during history taking with an empirically derived scale for clinical reasoning indicators. BMC Med Educ. 2020;20:368. DOI: 10.1186/s12909-020-02260-9
9. Bußenius L, Kadmon M, Berberat PO, Harendza S. Evaluating the Global Rating scale's psychometric properties to assess communication skills of undergraduate medical students in video-recorded simulated patient encounters. Patient Educ Couns. 2022;105(3):750-755. DOI: 10.1016/j.pec.2021.06.001
10. Gärtner J, Bußenius L, Schick K, Prediger S, Kadmon M, Berberat PO, Harendza S. Validation of the ComCare index for rater-based assessment of medical communication and interpersonal skills. Patient Educ Couns. 2022;105(4):1004-1008. DOI: 10.1016/j.pec.2021.07.051
11. Brätz J, Bußenius L, Brätz I, Grahn H, Prediger S, Harendza S. Assessment of final-year medical students' entrustable professional activities after education on an interprofessional training ward: A case-control study. Perspect Med Educ. 2022;11:266-272. DOI: 10.1007/s40037-022-00720-0
12. Bußenius L, Harendza S. Are different medical school admission tests associated with the outcomes of a simulation-based OSCE? BMC Med Educ. 2021;21:263. DOI: 10.1186/s12909-021-02703-x
13. Harendza S, Gärtner J, Zelesniack E, Prediger S. Evaluation of a telemedicine-based training for final-year medical students including simulated patient consultations, documentation, and case presentation. GMS J Med Educ. 2020;37(7):Doc94. DOI: 10.3205/zma001387
14. Burt J, Abel G, Elmore N, Campbell J, Roland M, Benson J, Silverman J. Assessing communication quality of consultations in primary care: initial reliability of the Global Consultation Rating Scale, based on the Calgary-Cambridge Guide to the Medical Interview. BMJ Open. 2014;4:e004339. DOI: 10.1136/bmjopen-2013-004339
15. McLaughlin K, Ainslie M, Coderre S, Wright B, Violato C. The effect of differential rater function over time (DRIFT) on objective structured clinical examination ratings. Med Educ. 2009;43(10):989-992. DOI: 10.1111/j.1365-2923.2009.03438.x