gms | German Medical Science

GMS Journal for Medical Education

Gesellschaft für Medizinische Ausbildung (GMA)

ISSN 2366-5017

“Fit for the finals” – project report on a telemedical training with simulated patients, peers, and assessors for the licensing exam


  • corresponding author Sigrid Harendza - Universitätsklinikum Hamburg-Eppendorf, III. Medizinische Klinik, Hamburg, Germany
  • Lisa Bußenius - Universitätsklinikum Hamburg-Eppendorf, III. Medizinische Klinik, Hamburg, Germany
  • Julia Gärtner - Universitätsklinikum Hamburg-Eppendorf, III. Medizinische Klinik, Hamburg, Germany
  • Miriam Heuser - Albert-Ludwigs-Universität Freiburg, Medizinische Fakultät, Studiendekanat, Freiburg, Germany
  • Jonathan Ahles - Albert-Ludwigs-Universität Freiburg, Medizinische Fakultät, Studiendekanat, Freiburg, Germany
  • Sarah Prediger - Universitätsklinikum Hamburg-Eppendorf, III. Medizinische Klinik, Hamburg, Germany

GMS J Med Educ 2023;40(2):Doc17

doi: 10.3205/zma001599, urn:nbn:de:0183-zma0015997

This is the English version of the article. A German version is also available.

Received: November 6, 2022
Revised: January 11, 2023
Accepted: February 6, 2023
Published: April 17, 2023

© 2023 Harendza et al.
This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 License.

Abstract


Background: Undergraduate medical students take the licensing exam (M3) as a two-day oral-practical examination. The main requirements are to demonstrate history taking skills and coherent case presentations. The aim of this project was to establish a training in which students can test their communication skills during history taking and their clinical reasoning skills in focused case presentations.

Methods: In the newly developed training, final-year students, in the role of physicians, took four telemedical histories from simulated patients (SPs). They received further findings for two of their SPs and presented these in a handover, in which they in turn received a handover of two SPs whom they had not seen themselves. Each student presented one of the two received SPs in a case discussion with a senior physician. The SPs gave the participants feedback on their communication and interpersonal skills with the ComCare questionnaire, and the senior physician gave feedback on the case presentation. Sixty-two final-year students from the universities of Hamburg and Freiburg participated in September 2022 and evaluated the training.

Results: Participants felt that the training was very appropriate for exam preparation. The SPs' feedback on communication and the senior physician's feedback on clinical reasoning skills received the highest ratings in importance to the students. Participants highly valued the practice opportunity for structured history taking and case presentation and would like to have more such opportunities in the curriculum.

Conclusion: Essential elements of the medical licensing exam, including feedback, can be represented in this telemedical training, which can be offered independent of location.

Keywords: exam, formative testing, simulation, telemedicine, training

1. Introduction

Regardless of whether they complete a regular or a model curriculum, undergraduate medical students in Germany end their studies, according to the licensing regulations (ÄApprO), with the final licensing exam (M3) (§ 30 ÄApprO, see []). This is an oral-practical exam that takes place on two days. The first day is reserved for the practical examination with patient presentation (§ 30 para. 1 ÄApprO). In the M3 exam, the examinee has to show that s*he knows how to apply the knowledge acquired during undergraduate studies in practice, has the necessary skills and abilities to conduct medical conversations (§ 30 para. 3 ÄApprO), and knows how to behave towards the patient in accordance with the general rules of medical conduct (§ 30 para. 3 sentence 10 ÄApprO). In particular, the examinee should demonstrate that s*he has mastered the technique of taking a medical history and of basic laboratory methods and is able to assess their results, is able to obtain and request the information required to make a diagnosis, to recognize its different significance and weighting for making a diagnosis, and to utilize it critically within the frame of differential diagnostic considerations (§ 30 para. 3 sentences 2-3 ÄApprO). This process is referred to as clinical reasoning [1], [2], [3] and represents the basis of medical thinking and action. The assessors for the M3 exam are appointed by the state examination office of the respective federal state on the recommendation of the universities and are usually at least board-certified medical specialists. Their tasks include ensuring that the exam is conducted in accordance with the ÄApprO, grading the exam, and documenting its content.

During undergraduate medical training, the opportunity to prepare for this type of oral-practical exam is very limited, as the majority of exams during the semesters are multiple-choice exams or objective structured clinical examinations (OSCEs), even at relatively newly established universities [4]. Also, there are only a few published teaching formats in German-language medical curricula that support the explicit learning and practice of clinical reasoning [5], [6], [7], [8], [9], [10], [11], [12], [13], although the introduction of clinical reasoning into medical curricula is explicitly demanded in the European higher education area [14] and a standard work – now in its second edition [15] – as well as further didactic instructions for teaching clinical reasoning have been available internationally since 1991 [16], [17]. An explanatory model for clinical reasoning, as for many other decision processes, is the so-called dual-process theory [18], [19]. While the intuitive way of thinking is applied, for example, in multiple-choice questions and is therefore learned implicitly [20], the analytical way of thinking, if not explicitly taught, can be observed, for example, in the behavior of physician role models in case discussions when they justify their working hypotheses and further diagnostic or therapeutic steps. It has also been studied that these two ways of thinking are used intermittently in everyday medical practice and that typical cognitive errors occur in both intuitive and analytical thinking [21], [22]. The use of clinical reasoning can be assessed during history taking [23], [24] as well as during case presentations [5], [25].

Some medical faculties, commercial companies, and medical professional associations offer seminars to prepare for the licensing exam and to familiarize students with the specific circumstances of the examination situation. However, these seminars usually place no particular focus on physician-patient communication and clinical reasoning, even though physician-patient communication is an essential component of the parts of the M3 exam that take place with patients, and clinical reasoning is a crucial prerequisite for focused case presentations and discussions, which form an important part of the M3 exam both at the bedside and, in additional cases, without patient participation. Therefore, the aim of this project was to develop a training that students can undergo towards the end of their final year in order to test their communication skills while taking focused histories and their clinical reasoning skills in focused patient presentations, and to receive feedback on both. This is intended to enable final-year students to prepare for the oral-practical exam in a way that is more tailored to their needs and oriented towards the expression of their own competences in these two areas.

2. Project description

In 2020, we developed a competence-based telemedicine training for undergraduate medical students in their final year at the Center for the Development and Assessment of Medical Competences at the University Medical Center Hamburg-Eppendorf [26]. This training included a telemedical consultation with four simulated patients per participant, patient documentation and ordering of further diagnostics using an electronic patient file, as well as a case presentation per participant in a digital case discussion with a senior physician. It is a telemedical further development of two previous projects, in which we had developed and validated a training format for a simulated first day of work as a physician, based on essential facets of competence for physicians at the beginning of postgraduate training [27], [28], [29], [30]. The previously established telemedicine training format [26] was redesigned for the “Fit for the finals” training as follows (see figure 1 [Fig. 1]).

All participants had received a written briefing on the content and technical procedure of the training in advance, including further documents from the UKE clinical reasoning course [5] on focused history taking and case presentation with reasoning. The main aspects were repeated in a personal briefing by the organizer of the training, and the participants had the opportunity to ask questions. Analogous to the previous telemedicine-based training [26], a telemedical consultation hour with four simulated patients per participant took place in the first phase (consultation hour). Eight students participated in each round of the training at the same time (group A and group B), whereby the patient cases for groups A and B were different. Figure 2 [Fig. 2] shows a simulated patient in the telemedicine setting with a tablet; a total of eight tablets were required. All patient cases were designed according to real patients from the emergency department of the University Medical Center Hamburg-Eppendorf and included internal and surgical diseases that are frequently assessed in the final exam. Furthermore, in addition to a chief complaint, all patient cases were again designed with a personal situation that presented a communicative challenge [26]. The roles were played by professional actors and actresses who were specially selected for the respective roles and trained by SH and SP for the physician-patient interviews and the completion of the evaluation forms (see attachment 1 [Attach. 1]). Each interview was scheduled for a maximum of 10 minutes, and all interviews were recorded on video. The participants were provided with the corresponding findings of the physical examination after each encounter with the simulated patients, and during the five minutes until the next interview they could think about the previous case, including the physical findings.
The simulated patients electronically completed the ComCare questionnaire after each interview, a validated instrument for measuring communicative and interpersonal skills [31], [32], which contains open and closed questions. Eight tablets were also required for this purpose. The participants received the results of these questionnaires with the quantitative evaluation of the items as well as their personal feedback after the end of the training.

In a second phase (case preparation) after the telemedical consultation hour, the participants received further findings for two of their four patients, e.g. laboratory results, ECGs, X-rays, or other findings. They were also given an electronic form for each of the two patients in which they were to document several differential diagnoses to structure the case presentation. For each differential diagnosis, participants were asked to document in two boxes (“confirming aspects” and “disconfirming aspects”) the information that made the respective differential diagnosis more or less likely based on the patient's history, the physical examination, and the additional diagnostic findings. This electronic form was modeled after a virtual patient program for clinical reasoning training [33]. Finally, participants were asked to document the working diagnosis with which they wanted to hand over each of the two cases. They were also asked to use a slider (from “very uncertain” to “very certain”) to indicate how certain they felt about each working diagnosis after weighing the differential diagnoses by arguments (see figure 3 [Fig. 3]).
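To illustrate the structure of such an electronic case-preparation form, a minimal sketch in Python follows. This is purely hypothetical: the actual software used in the training is not described in the text, and all class and field names here are assumptions chosen to mirror the described elements (differential diagnoses with confirming/disconfirming aspects, a working diagnosis, and a certainty slider).

```python
from dataclasses import dataclass, field

@dataclass
class DifferentialDiagnosis:
    """One differential diagnosis with confirming/disconfirming aspects
    (hypothetical model of the two boxes described in the text)."""
    name: str
    confirming: list = field(default_factory=list)     # aspects making it more likely
    disconfirming: list = field(default_factory=list)  # aspects making it less likely

@dataclass
class CaseForm:
    """Electronic form for one patient (hypothetical sketch, not the real tool)."""
    patient_id: str
    differentials: list = field(default_factory=list)
    working_diagnosis: str = ""
    certainty: float = 0.0  # slider: 0.0 ("very uncertain") to 1.0 ("very certain")

# Hypothetical example entry for one patient case
form = CaseForm(patient_id="A-01")
form.differentials.append(DifferentialDiagnosis(
    name="Pulmonary embolism",
    confirming=["acute dyspnea", "tachycardia"],
    disconfirming=["normal D-dimer"],
))
form.working_diagnosis = "Pulmonary embolism"
form.certainty = 0.4
```

The point of such a structure is that each argument for or against a diagnosis is recorded explicitly, which is exactly what the form was intended to scaffold for the later case presentation.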

In the third phase (handover), the participants of group A reported to the participants of group B in different rounds on one of their two patients and vice versa, whereby for each conversation the participants were virtually shifted in such a way that they met new dialog partners each time. Four laptops were required for this purpose. With this approach, the respective receiving person took on the role of an assessor in these peer case handovers. Their task was to understand the received case and to discuss it with the person presenting it, in order to be able to present one of the received cases later in a structured way by themselves. This procedure was intended to simulate the situation of an actual handover and thus, at the same time, to focus attention on the essential aspects of a case. In the briefing text, all participants had received an example of a focused case presentation showing how to use the confirming and disconfirming aspects from the electronic form for clinical reasoning in weighing differential diagnoses. Six minutes were available for each handover. The case handovers were also video recorded.

In the fourth phase (case presentation and discussion), all eight participants of groups A and B met digitally with a senior physician via laptop. The participants were informed beforehand which of the two received patients they had to present. The patients were called up individually and the participants then had ten minutes to introduce each patient, discuss them with the senior physician and the peers, look at essential findings (e.g. ECGs or X-rays) together, and medically solve the cases with regard to further needed diagnostics and therapy. In addition, the participants received feedback on their clinical reasoning from the senior physician. Finally, a debriefing of the training took place with the eight participants of each round as a group discussion. These two phases were also videotaped. The discussion was transcribed verbatim for the evaluation of the students' contributions.

In September 2022, a total of 62 students (47 from the University of Hamburg and 15 from the University of Freiburg) participated in the “Fit for the finals” training over two days shortly before completing their final year. Their mean age was 27.6±3.7 years. Of the 62 participants, 80.6% were female and 19.4% male. Students had been informed via digital bulletin boards or email of the opportunity to participate in this voluntary training, and registration was on a first-come, first-served basis. For logistical reasons, the invitation to the Hamburg students was issued two weeks earlier than the invitation to the Freiburg students. For the scientific monitoring of this project, approval of the Ethics Committee of the Hamburg Chamber of Physicians was obtained (reference number: PV3649) and the students consented in writing to participate. After the debriefing, the participants received a digital questionnaire to evaluate the training, in which they answered questions about their own experiences during the training, about the training as a whole as well as its phases, and about the organization of the training on a 5-point Likert scale (1: does not apply, 2: rather does not apply, 3: partly applies, 4: rather applies, 5: fully applies).
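The evaluation results below are reported as mean ± standard deviation of 5-point Likert ratings. As an illustration only (the analysis software actually used is not stated in the text), such descriptive statistics can be computed as follows; the helper function and the example ratings are hypothetical:

```python
from statistics import mean, stdev

def summarize_likert(ratings):
    """Return (mean, sample standard deviation) for 5-point Likert ratings.

    Hypothetical helper for illustration; assumes integer ratings 1-5
    (1: does not apply ... 5: fully applies).
    """
    if any(r < 1 or r > 5 for r in ratings):
        raise ValueError("Likert ratings must be between 1 and 5")
    return mean(ratings), stdev(ratings)

# Example: invented ratings from eight participants for one item
m, sd = summarize_likert([5, 4, 5, 5, 4, 5, 3, 5])
print(f"{m:.2f} ± {sd:.2f}")  # prints: 4.50 ± 0.76
```

Note that `stdev` computes the sample standard deviation (n−1 in the denominator), the usual choice when, as here, the participants are treated as a sample.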

3. Results

In the assessment of their communicative and interpersonal skills by the simulated patients with the ComCare, the students received on average 4.15±0.45 points of a maximum of 5 points (see table 1 [Tab. 1]). In particular, “use of comprehensible language” (4.71±0.32), “comprehensible explanation of next diagnostic and therapeutic steps” (4.41±0.41), and “attentive listening” (4.29±0.52) were rated highest by the simulated patients. The item “the physician was interested in me as a person and in my environment” received the lowest rating of 3.45±0.67 from the perspective of the simulated patients.

After experiencing the training situation, the students rated themselves most confident in dealing with the patients (4.08±0.86) and least confident in clinical reasoning (3.31±0.88) (see table 2 [Tab. 2]). They considered the patient cases very useful for practicing differential diagnostic thinking and the interviews with the simulated patients for practicing focused history taking (4.85±0.41 and 4.76±0.50, respectively). Feedback from the simulated patients on their own communication skills and feedback from the teacher on the presentation of a patient case were very important to participants (4.79±0.49 and 4.90±0.31, respectively). In the free-text comments of the evaluation, the constructive feedback of the simulated patients and the teacher as well as the variety and depth of the real patient cases were also mentioned as essential aspects of the training. In addition, the open learning atmosphere and the change of roles into the position of an assessor (as receiver of a handover) were found to be helpful. The debriefing groups revealed that the participants had recognized essential principles of clinical reasoning for themselves: “[...] as long as you can justify your decisions [...] everything is okay”, “[...] you don't need to [be] stressed out, if you don't know something, just explain your ideas” and “if [your concept] is not conclusive, you [have] to question it”. In addition, it turned out that for many participants there had apparently been little opportunity to practice structured case presentations during the final year, or there were concerns about not receiving adequate feedback (“unfortunately, I never had my own patients [to] deliver a structured presentation”; “I missed real professional exchange and confident answers from senior physicians very much”; “[...] depending on the person leading the ward rounds, you might think twice [whether to present a patient or not], because it might sometimes be difficult with some personalities”).

Overall, the participants rated the “Fit for the finals” training with a school grade of 1.2±0.41 (on the German scale, where 1 is the best grade). They were very satisfied with the organizational communication and processes (4.76±0.50) as well as with the technical processes of the training (4.25±0.88). They considered the training very suitable as preparation for the M3 exam (4.56±0.65) and would recommend the course to their fellow students (4.88±0.33). Reasons given for recommending it included “[...] because you gain confidence and realize that you don't have to know everything”; “[...] because you take on the role of the receiver and then the presenter”; “[...] because mistakes are not seen as a problem but as an opportunity to learn the systematic approach”.

4. Discussion and conclusion

The evaluations of the simulated patients and the feedback of the students show that the two main goals of the training, providing the participants with feedback on their communication competence during history taking and on their clinical reasoning competence, were achieved. The redesigned training for the M3 exam allowed participants to test their own communication and clinical reasoning skills so that, as they reported, they can use the feedback to adapt the preparation for their oral-practical exam to their needs. As the results of the ComCare questionnaire show, the students achieved good ratings in the communicative aspects, while there is still room for improvement in some interpersonal aspects. The participants found the additional feedback from the simulated patients very helpful. This is consistent with findings that feedback from simulated patients helps to improve students' communication skills [34]. The participants also experienced the interaction with the simulated patients itself as useful for their own learning due to the authenticity of the cases. Another study showed that interactions with simulated patients contribute to professional development already during the interaction [35].

By changing roles and thus perspectives from presenting to receiving and back to presenting a case again, participants reported experiencing essential aspects of clinical reasoning in discussion with their peers, in terms of the focusing and reasoning required when presenting patient cases in the oral-practical exam. A meta-analysis of the feedback students received on their clinical performance during examinations showed highly variable and, in some cases, only marginally beneficial results with regard to the usefulness of this feedback for their own learning and personal development [36]. Peer feedback within the case discussion phase of our training was found to be very useful for improving one's case presentation skills, especially due to the change of roles. The teacher's feedback on the patient presentation and the clinical reasoning process was also very important to the participants for their own learning, as there had apparently been little opportunity for many participants to practice clinical reasoning and case presentations with feedback during their studies. With appropriate teacher training on clinical reasoning [37], it should be possible, with reasonable effort, to provide students with learning opportunities on clinical reasoning and case presentation in other phases of their studies, so that they could use a training such as the one in this project even more effectively for self-assessment of their skills. However, various aspects have been identified that stand in the way of implementing a longitudinal clinical reasoning curriculum [38] and need to be considered individually at different study locations in order to implement clinical reasoning successfully. Should the implementation of a clinical reasoning curriculum prove difficult, at least regular feedback from teachers or even peers seems to be helpful for learning communication and other clinical skills [39], [40], [41].

Even though only a small sample of 62 volunteer students from two medical schools participated in this first run of the training, it could already be shown that the intended learning objectives were achieved from the participants' point of view. It can be assumed that these results are transferable to a larger sample. By its format and with appropriate feedback, the training helps students to reflect on their personal skills with regard to communication and clinically well-argued case presentation, to identify possible deficits, and thus, from their perspective, to better set their own priorities in preparation for the oral-practical exam. As a further element of the training not used so far, an individual analysis of the history-taking and case-discussion videos with individual feedback from lecturers or peers would be possible. Due to the telemedical approach, the training can easily be offered nationwide and independent of location, as demonstrated in this study.


Funding

This project was supported by the Joachim Herz Foundation.


Ethics

This project was conducted in accordance with the Declaration of Helsinki. The Ethics Committee of the Hamburg Chamber of Physicians approved the study and raised no objections (reference number: PV3649). Participation was voluntary and anonymized, and the participants gave written consent to study participation, including digital recording and the retention of all collected records for at least ten years.


Acknowledgements

We thank the medical students of the Universities of Hamburg and Freiburg who participated in the training and the actresses and actors Theresa Berlage, Jantje Billker, Christian Bruhn, Claudia Claus, Christiane Filla, Uwe Job, Thomas Klees, Frank Thomé. Many thanks to Axel Kirchhof for the photograph (figure 2).

Competing interests

The authors declare that they have no competing interests.


References

1. Elstein AS, Schwartz A. Clinical problem solving and diagnostic decision making: selective review of the cognitive literature. BMJ. 2002;324(7339):729-732. DOI: 10.1136/bmj.324.7339.729
2. Durning S, Artino AR Jr, Pangaro L, van der Vleuten CP, Schuwirth L. Context and clinical reasoning: understanding the perspective of the expert's voice. Med Educ. 2011;45(9):927-938. DOI: 10.1111/j.1365-2923.2011.04053.x
3. Kassirer JP, Wong JB, Kopelman RI. Learning Clinical Reasoning. 2nd ed. Baltimore: Lippincott Williams & Wilkins Health; 2009.
4. Winkelmann A, Schendzielorz J, Maske D, Arends P, Bohne C, Hölzer H, Harre K, Nübel J, Otto B, Oess S. The Brandenburg reformed medical curriculum: study locally, work locally. GMS J Med Educ. 2019;36(5):Doc49. DOI: 10.3205/zma001257
5. Harendza S, Krenz I, Klinge A, Wendt U, Janneck M. Implementation of a Clinical Reasoning Course in the Internal Medicine trimester of the final year of undergraduate medical training and its effect on students’ case presentation and differential diagnostic skills. GMS J Med Educ. 2017;34(5):Doc66. DOI: 10.3205/zma001143
6. Koenemann N, Lenzer B, Zottmann JM, Fischer MR, Weidenbusch M. Clinical Case Discussions – a novel, supervised peer-teaching format to promote clinical reasoning in medical students. GMS J Med Educ. 2020;37(5):Doc48. DOI: 10.3205/zma001341
7. Braun LT, Borrmann KF, Lottspeich C, Heinrich DA, Kiesewetter J, Fischer MR, Schmidmaier R. Scaffolding clinical reasoning of medical students with virtual patients: effects on diagnostic accuracy, efficiency, and errors. Diagnosis (Berl). 2019;6(2):137-149. DOI: 10.1515/dx-2018-0090
8. Klein M, Otto B, Fischer MR, Stark R. Fostering medical students’ clinical reasoning by learning from errors in clinical case vignettes: effects and conditions of additional prompting procedures to foster self-explanations. Adv Health Sci Educ Theory Pract. 2019;24(2):331-351. DOI: 10.1007/s10459-018-09870-5
9. Djermester P, Gröschke C, Gintrowicz R, Peters H, Degel A. Bedside teaching without bedside – an introduction to clinical reasoning in COVID-19 times. GMS J Med Educ. 2021;38(1):Doc14. DOI: 10.3205/zma001410
10. Zottmann JM, Horrer A, Chouchane A, Huber J, Heuser S, Iwaki L, Kowalski C, Gartmeier M, Berberat PO, Fischer MR, Weidenbusch M. Isn’t here just there without a “t” – to what extent can digital Clinical Case Discussions compensate for the absence of face-to-face teaching? GMS J Med Educ. 2020;37(7):Doc99. DOI: 10.3205/zma001392
11. Hege I, Kononowicz A, Kiesewetter J, Foster-Johnson L. Uncovering the relation between clinical reasoning and diagnostic accuracy – an analysis of learner’s clinical reasoning process in virtual patients. PLoS One. 2018;13(10):e0204900. DOI: 10.1371/journal.pone.0204900
12. Schuelper N, Ludwig S, Anders S, Raupach T. The impact of medical students’ individual teaching format choice on the learning outcome related to clinical reasoning. JMIR Med Educ. 2019;5(2):e13386. DOI: 10.2196/13386
13. Middeke A, Anders S, Schuelper M, Raupach T, Schuelper N. Training of clinical reasoning with a Serious Game versus small-group problem-based learning: a prospective study. PLoS One. 2018;13(9):e0203851. DOI: 10.1371/journal.pone.0203851
14. Parodis I, Andersson L, Durning SJ, Hege I, Knez J, Kononowicz AA, Lidskog M, Petreski T, Szopa M, Edelbring S. Clinical reasoning needs to be explicitly addressed in health professions curricula: recommendations from a European consortium. Int J Environ Res Public Health. 2021;18(21):11202. DOI: 10.3390/ijerph182111202
15. Kassirer JP, Wong JB, Kopelman RI. Learning Clinical Reasoning. 2nd ed. Baltimore: Lippincott Williams & Wilkins Health; 2010.
16. Kassirer JP. Teaching clinical reasoning: case-based and coached. Acad Med. 2010;85(7):1118-1124. DOI: 10.1097/ACM.0b013e3181d5dd0d
17. Schmidt HG, Mamede S. How to improve the teaching of clinical reasoning: a narrative review and a proposal. Med Educ. 2015;49(10):961-973. DOI: 10.1111/medu.12775
18. Pelaccia T, Tardif J, Triby E, Charlin B. An analysis of clinical reasoning through a recent and comprehensive approach: the dual-process theory. Med Educ Online. 2011;16(1):5890. DOI: 10.3402/meo.v16i0.5890
19. Marcum JA. An integrated model of clinical reasoning: dual-process theory of cognition and metacognition. J Eval Clin Pract. 2012;18(5):954-961. DOI: 10.1111/j.1365-2753.2012.01900.x
20. Freiwald T, Salimi M, Khaljani E, Harendza S. Pattern recognition as a concept for multiple-choice questions in a national licensing exam. BMC Med Educ. 2014;14:232. DOI: 10.1186/1472-6920-14-232
21. Norman G, Monteiro S, Sherbino J. Is clinical cognition binary or continuous? Acad Med. 2013;88(8):1058-1060. DOI: 10.1097/ACM.0b013e31829a3c32
22. Norman G, Sherbino J, Dore K, Wood T, Young M, Gaissmaier W, Kreuger S, Monteiro S. The etiology of diagnostic errors: a controlled trial of system 1 versus system 2 reasoning. Acad Med. 2014;89(2):277-284. DOI: 10.1097/ACM.0000000000000105
23. Haring CM, Cools BM, van Gurp PJM, van der Meer JWM, Postma CT. Observable phenomena that reveal medical students’ clinical reasoning ability during expert assessment of their history taking: a qualitative study. BMC Med Educ. 2017;17(1):147. DOI: 10.1186/s12909-017-0983-3
24. Fürstenberg S, Helm T, Prediger S, Kadmon M, Berberat PO, Harendza S. Assessing clinical reasoning in undergraduate medical students during history taking with an empirically derived scale for clinical reasoning indicators. BMC Med Educ. 2020;20(1):368. DOI: 10.1186/s12909-020-02260-9
25. Fagundes EDT, Ibiapina CC, Alvim CG, Fernandes RAF, Carvalho-Filho MC, Brand BLP. Case presentation methods: a randomized controlled trial of the one-minute preceptor versus SNAPPS in a controlled setting. Perspect Med Educ. 2020;9(4):245-250. DOI: 10.1007/s40037-020-00588-y
26. Harendza S, Gärtner J, Zelesniack E, Prediger S. Evaluation of a telemedicine-based training for final-year medical students including simulated patient consultations, documentation, and case presentation. GMS J Med Educ. 2020;37(7):Doc94. DOI: 10.3205/zma001387
27. Wijnen-Meijer M, van der Schaaf M, Nillesen K, Harendza S, Ten Cate O. Essential facets of competence that enable trust in medical graduates: a ranking study among physician educators in two countries. Perspect Med Educ. 2013;2(5-6):290-297. DOI: 10.1007/s40037-013-0090-z
28. Fürstenberg S, Schick K, Deppermann J, Prediger S, Berberat PO, Kadmon M, Harendza S. Competencies for first year residents – physicians’ views from medical schools with different undergraduate curricula. BMC Med Educ. 2017;17(1):154. DOI: 10.1186/s12909-017-0998-9
29. Wijnen-Meijer M, van der Schaaf M, Booij E, Harendza S, Boscardin C, van Wijngaarden J, Ten Cate O. An argument-based approach to the validation of UHTRUST: can we measure how recent graduates can be trusted with unfamiliar tasks? Adv Health Sci Educ Theory Pract. 2013;18(5):1009-1027. DOI: 10.1007/s10459-013-9444-x
30. Prediger S, Schick K, Fincke F, Fürstenberg S, Oubaid V, Kadmon M, Berberat PO, Harendza S. Validation of a competence-based assessment of medical students’ performance in the physician’s role. BMC Med Educ. 2020;20(1):6. DOI: 10.1186/s12909-019-1919-x
31. Gärtner J, Prediger S, Harendza S. Development and pilot test of ComCare – a questionnaire for quick assessment of communicative and social competences in medical students after interviews with simulated patients. GMS J Med Educ. 2021;38(3):Doc68. DOI: 10.3205/zma001464
32. Gärtner J, Bußenius L, Schick K, Prediger S, Kadmon M, Berberat PO, Harendza S. Validation of the ComCare index for rater-based assessment of medical communication and interpersonal skills. Patient Educ Couns. 2022;105(4):1004-1008. DOI: 10.1016/j.pec.2021.07.051
33. Waechter J, Allen J, Lee CH, Zwaan L. Development and pilot testing of a data-rich clinical reasoning training and assessment tool. Acad Med. 2022;97(10):1484-1488. DOI: 10.1097/ACM.0000000000004758
34. Qureshi AA, Zehra T. Simulated patient’s feedback to improve communication skills of clerkship students. BMC Med Educ. 2020;20(1):15. DOI: 10.1186/s12909-019-1914-2
35. Lovink A, Groenier M, van der Niet A, Miedema H, Rethans JJ. The contribution of simulated patients to meaningful student learning. Perspect Med Educ. 2021;10(6):341-346. DOI: 10.1007/s40037-021-00684-7
36. Scarff CE, Bearman M, Chiavaroli N, Trumble S. Trainees’ perspectives of assessment messages: a narrative systematic review. Med Educ. 2019;53(3):221-233. DOI: 10.1111/medu.13775
37. Dhaliwal G. Developing teachers of clinical reasoning. Clin Teach. 2013;10(5):313-317. DOI: 10.1111/tct.12082
38. Sudacka M, Adler M, Durning SJ, Edelbring S, Frankowska A, Hartmann D, Hege I, Huwendiek S, Sobočan M, Thiessen N, Wagner FL, Kononowicz AA. Why is it so difficult to implement a longitudinal clinical reasoning curriculum? A multicenter interview study on the barriers perceived by European health profession educators. BMC Med Educ. 2021;21(1):575. DOI: 10.1186/s12909-021-02960-w
39. Björklund K, Stenfors T, Nilsson GH, Leanderson C. Multisource feedback in medical students’ workplace learning in primary health care. BMC Med Educ. 2022;22(1):401. DOI: 10.1186/s12909-022-03468-7
40. Dewan M, Norcini J. A purpose driven fourth year of medical school. Acad Med. 2018;93(4):581-585. DOI: 10.1097/ACM.0000000000001949
41. Gilkes L, Kealley N, Frayne J. Teaching and assessment of clinical diagnostic reasoning in medical students. Med Teach. 2022;44(6):650-656. DOI: 10.1080/0142159X.2021.2017869