gms | German Medical Science

GMS Journal for Medical Education

Gesellschaft für Medizinische Ausbildung (GMA)

ISSN 2366-5017

Medical knowledge and teamwork predict the quality of case summary statements as an indicator of clinical reasoning in undergraduate medical students

Article: Clinical Reasoning

  • author Sophie Fürstenberg - University Medical Center Hamburg-Eppendorf, III. Department of Internal Medicine, Hamburg, Germany
  • author Viktor Oubaid - German Aerospace Center (DLR), Hamburg, Germany
  • author Pascal O. Berberat - Technical University of Munich, TUM Medical Education Center, School of Medicine, Munich, Germany
  • author Martina Kadmon - University of Augsburg, Faculty of Medicine, Deanery, Augsburg, Germany
  • corresponding author Sigrid Harendza - University Medical Center Hamburg-Eppendorf, III. Department of Internal Medicine, Hamburg, Germany

GMS J Med Educ 2019;36(6):Doc83

doi: 10.3205/zma001291, urn:nbn:de:0183-zma0012914

This is the English version of the article.
The German version can be found at:

Received: October 31, 2018
Revised: September 8, 2019
Accepted: September 26, 2019
Published: November 15, 2019

© 2019 Fürstenberg et al.
This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 License. See license information at


Abstract
Background: Clinical reasoning refers to a thinking process that includes medical problem solving and medical decision making skills. Several studies have shown that the clinical reasoning process can be influenced by a number of factors, e.g. context or personality traits, and the results of this thinking process are expressed in case presentations. The aim of this study was to identify factors that predict the quality of case summary statements as an indicator of clinical reasoning in undergraduate medical students in an assessment simulating the first day of residency.

Methods: To investigate factors predicting aspects of clinical reasoning, 67 advanced undergraduate medical students participated in the role of beginning residents in our competence-based assessment, which included a consultation hour, a patient management phase, and a handover. Participants filled out a Post Encounter Form (PEF) to document their case summary statements and other aspects of clinical reasoning. After each phase, they filled out the Strain Perception Questionnaire (STRAIPER) to measure their situation-dependent mental strain. To assess medical knowledge, the participants completed a 100-question multiple-choice test. To measure stress resistance, adherence to procedures, and teamwork, students took part in the Group Assessment of Performance (GAP) test for flight school applicants. These factors were included in a multiple linear regression analysis.

Results: Medical knowledge and teamwork predicted the quality of case summary statements as an indicator of clinical reasoning in undergraduate medical students and explained 20.3% of the variance. Neither age, gender, undergraduate curriculum, academic advancement, nor high school grade point average of the medical students in our sample had an effect on their clinical reasoning skills.

Conclusion: The quality of case summary statements as an indicator of clinical reasoning can be predicted in undergraduate medical students by their medical knowledge and teamwork. Students should be supported in developing the ability to work in a team and in acquiring long-term knowledge for good case summary statements as an important aspect of clinical reasoning.

Keywords: clinical reasoning, competence-based assessment, knowledge, strain, teamwork


Introduction
Clinical reasoning is the cognitive process physicians use to investigate patients’ problems based on the information gathered from history, physical examination, and sometimes additional test results to make a diagnosis [1]. The different aspects and skills of clinical reasoning need to be developed during undergraduate medical training, e.g. in seminars [2], and should be refined during postgraduate medical education [3]. The context of clinical encounters has an additional influence on the way clinical reasoning is executed [4]. Within the diagnostic process, different levels of expertise are associated with different approaches to reasoning [5] and include the intuitive and the hypothetico-deductive way of making decisions [6].

Case presentations are used as a teaching format which includes and summarizes aspects of clinical reasoning [7]. Tools have been developed to teach and assess indicators of clinical reasoning during oral or written case presentations, which constitute the result of the diagnostic thinking process [8], [9]. Presenting patients in front of others, e.g. during morning report, is a necessary requirement to summarize and share the clinical reasoning process with colleagues, but can be a stressful experience [10]. Furthermore, knowledge plays an important role in how new information is processed and reflected upon in the thinking process [11]. Additionally, contextual factors such as emotional reactions can impact medical students’ clinical reasoning [12]. Signs of strain perception, which often occur with the start of medical students’ clinical rotations [13], might influence clinical reasoning negatively, because if demanding work becomes straining, its outcome can be reduced [14]. Therefore, stress resistance is a desired characteristic in medical students, which is sometimes included in multiple mini interviews for medical school admission [15].

In addition to contextual factors, personality-specific factors can also influence academic performance [16]. During Objective Structured Clinical Examinations (OSCEs), for instance, advanced medical students were observed to show high scores in procedural skills, which require a certain amount of adherence to procedures [17]. Good teamwork also plays an indispensable role in the daily routine on the ward [18], and especially the distribution and exchange of information within a team are crucial for the clinical reasoning process [19]. In medical education, real or virtual simulations have been shown to elucidate aspects of clinical reasoning and provide insights into developing clinical reasoning teaching or assessment formats [20], [21]. Aspects of clinical reasoning or the results of the clinical reasoning process and its factors of influence can be assessed in three different ways, namely by non-workplace-based assessment, by workplace-based assessment, or by assessment in simulated clinical environments [22]. Taken together, further studies are needed to explore factors of potential influence on the clinical reasoning process resulting in possible differences of case presentation quality.

In this study we examined which factors can predict the quality of case summary statements in undergraduate medical students in an assessment simulating the first day of residency. We focused particularly on case presentation as the result of the clinical reasoning process. Based on the current literature, we hypothesized that medical knowledge, perceived strain, stress resistance, teamwork, and adherence to procedures are primary predictors for the quality of case summary statements as an important indicator of clinical reasoning in our simulated medical context.



Methods
The study took place in July 2017. The evaluation of the quality of case summary statements as an indicator of clinical reasoning was part of a 360-degree competence assessment of undergraduate medical students in the role of beginning residents during a simulated first workday of residency [23]. This assessment was based on selected competences relevant for beginning residents [24] and represented a maximal simulation of a clinical environment. Every participant held an individual consultation hour with five simulated patients. The patient cases could not be solved by pattern recognition and included a woman with atrial fibrillation, a man with granulomatosis with polyangiitis, a woman with perforated sigmoid diverticulitis, a man with a contained perforation of an infrarenal aortic aneurysm, and an immunosuppressed woman with herpes zoster [23]. This was followed by a management phase (2.5 hours), in which the participants could organize their patients’ next diagnostic steps and interacted with other health care personnel. Finally, the participants handed their patients over to a resident in a 30-minute handover.


Participants
The participants of our study were 67 advanced undergraduate medical students (semester 10 to 12) from three medical schools with different 12-semester curricula who had volunteered to participate. They were selected on a first-come, first-served basis and received a 25 Euro book voucher after completion of the assessment. Data from five participants had to be excluded because their data sets were incomplete with respect to the summary statements. Data from 62 participants (n=32 from the University of Hamburg, n=6 from the University of Oldenburg, n=24 from the Technical University of Munich) were included in our analysis. The mean age of the 35 female and 27 male students was 26.1±2.2 years.


During the 360-degree assessment, participants filled out one free-text Post Encounter Form (PEF) [9] per patient during the management phase of the assessment. This form provides a scoring system including the items summary statement, list of problems, list of differential diagnoses, most likely diagnosis, and supporting data for the most likely diagnosis. In our study, we focused only on the summary statement as an important indicator of clinical reasoning [2], which provided the basis for the patient handover. The summary statements were assessed as one aspect of clinical reasoning by an experienced physician with respect to two aspects: the adequacy of the summary with respect to medical content (rating scale 1) and the use of proper medical terminology for symptoms and findings (rating scale 2). Rating scale 1 included the following 5-point Likert scale [9]: 1=“Unable to summarize”, 2=“Poor/inadequate summary”, 3=“Adequate summary”, 4=“Well summarized, recognizes key details”, 5=“Outstanding summary, demonstrates understanding”. Rating scale 2, also a 5-point Likert scale, consists of the following items: 1=“Uses lay terms or patient’s words”, 2=“Incorrect use of medical language”, 3=“Correctly uses some medical terminology”, 4=“Frequently and correctly uses medical terminology”, 5=“Advanced fluency in medical terminology, eloquent and concise” [9]. We calculated a combined score from scales 1 and 2 per participant [9], representing the mean of the scores for the five patients per participant: 1=“Participant's performance is below expectations”, 2=“Participant's performance is partly in line with expectations”, 3=“Participant's performance is in line with expectations”, 4=“Participant's performance exceeds expectations”, and 5=“Participant's performance exceeds expectations by far”.
The internal consistency (Cronbach’s alpha) of the assessment of the case summary statements of the PEF based on the average of these two items in our study sample was .57.
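The scoring scheme described above can be sketched in code: each participant's combined PEF score is the mean of the two 5-point ratings across the five patient cases, and the internal consistency of the two rating scales is Cronbach's alpha. This is a minimal illustration under the description given here; the function names and example ratings are hypothetical, not the study data.

```python
import numpy as np

def pef_summary_score(ratings):
    """Combined case-summary score for one participant.

    ratings: array-like of shape (5, 2) - five patients, two 5-point
    rating scales (medical content and medical terminology).
    Returns the mean over both scales and all five patients,
    as described for the combined PEF score.
    """
    return float(np.asarray(ratings, dtype=float).mean())

def cronbach_alpha(items):
    """Cronbach's alpha for k items rated for n participants.

    items: array-like of shape (n, k); here k=2 (the two PEF scales).
    alpha = k/(k-1) * (1 - sum(item variances) / variance of item sums)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return float(k / (k - 1) * (1 - item_vars.sum() / total_var))
```

For example, hypothetical ratings of (3,4), (2,3), (3,3), (4,4), (3,2) for the five patients yield a combined score of 3.1 for that participant.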

After each of the three phases of the 360-degree assessment, we measured students’ perceived strain, an aversive response produced in the individual by potentially harmful exposure, which can be expressed as stress in its strongest occurrence [25]. The students filled out the Strain Perception Questionnaire (STRAIPER) [26], which is based on the QCD (Short questionnaire on current disposition) by Müller and Basler [27] and includes the following six bipolar items: tension (calm versus tense), doubt (confident versus doubtful), concern (unconcerned versus worried), agitation (unwound versus agitated), discomfort (comfortable versus uncomfortable), and apprehensiveness (relaxed versus apprehensive). The questionnaire quantifies situation-dependent subjective mental strain. Each item is rated on a 6-point Likert scale (1=very low level to 6=very high level). In the current analysis, only the STRAIPER measurements after the consultation hour were included, as the mean of the six items. The Cronbach’s alpha of the STRAIPER is .78.

One week before the assessment, all participants completed a case-based multiple-choice test with 100 questions (and a maximum of 100 points) to assess their medical knowledge. This knowledge test was compiled from 1000 freely available United States Medical Licensing Examination Step 2 type items [28]. The selection process for the questions is described elsewhere [29].

Furthermore, one day later, participants of the 360-degree assessment additionally took part in part of the Group Assessment of Performance (GAP) test [30] used for testing flight school applicants [31]. It contains a validated 1.5-hour computerized team task to evaluate social and interactive skills. The following competences were assessed: stress resistance (SR), adherence to procedures (AP), and teamwork (TW). Stress resistance is defined as maintaining effective performance, control, and goal orientation under pressure or adversity. It also includes the absence of physiological symptoms (vegetative, motoric, or verbal). The competence of adherence to procedures is defined by knowledge and the disciplined and correct application of rules. Teamwork is characterized by active and constructive cooperation in the group process as well as by asking for the ideas and perspectives of others. A comprehensive description of these competences and their facets was given earlier [31]. The observation of the participants was carried out by two DLR aviation psychologists with more than 15 years and 2000 cases of experience in behavioral observation. The observers used a set of empirically derived behaviour checklists [26] to assess each competence on a 6-point Likert scale (1: very low occurrence to 6: very high occurrence). The interrater reliabilities for the DLR pilot assessment center using the GAP behaviour observation procedure are: SR=.82, AP=.75, and TW=.88 [32].

Statistical analysis

The statistical calculations were performed with SPSS Statistics (version 23) and included a multiple linear regression analysis with the following predictors: medical knowledge, perceived strain, stress resistance, adherence to procedures, and teamwork, with the quality of the case summary statements as an indicator of clinical reasoning as the dependent variable. The significance level was set at p=.05. All requirements for our linear regression model were fulfilled: our data showed additivity and linearity, independence of the residuals, variance in all predictors, normally distributed residuals, and no multicollinearity.
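The regression step described above can be sketched as a plain ordinary-least-squares fit with an intercept and an R² statistic of the kind reported in the results. This is an illustrative sketch with hypothetical data, not a reproduction of the authors' SPSS analysis.

```python
import numpy as np

def fit_ols(X, y):
    """Ordinary least squares with intercept; returns (coefficients, R^2).

    X: (n, p) predictor matrix, e.g. one column each for medical
    knowledge, perceived strain, stress resistance, adherence to
    procedures, and teamwork (hypothetical data).
    y: (n,) outcome, e.g. the combined case summary statement score.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    A = np.column_stack([np.ones(len(y)), X])   # prepend intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    ss_res = resid @ resid                       # residual sum of squares
    ss_tot = ((y - y.mean()) ** 2).sum()         # total sum of squares
    return beta, 1.0 - ss_res / ss_tot           # R^2 = explained variance
```

With real data, beta holds the unstandardized coefficients; standardized β values of the kind reported in the results would be obtained by z-scoring the predictors and the outcome before fitting.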


Results
The mean quality of our participants’ summary statements as an indicator of clinical reasoning was 2.78 (±.58), with 5 being the highest possible score. On average, they reached 73.3 (±9.1) of 100 possible points in the medical knowledge multiple-choice test. On a 6-point scale ranging from 1=“very low” to 6=“very high”, they showed a perceived strain of 3.87 (±.79), a stress resistance score of 4.11 (±.71), an adherence to procedures score of 5.51 (±.63), and a teamwork score of 3.49 (±.83).

Two of the predictors, i.e. medical knowledge (10.3%) and teamwork (10.0%), explained in combination a significant portion (20.3%) of the variance in the quality of case summary statements as an indicator of clinical reasoning (R²=.203, F(5, 62)=2.844, p=.023), as shown in table 1 [Tab. 1]. Both medical knowledge (β=.372, t(62)=2.788, p=.007) and teamwork (β=.401, t(62)=2.521, p=.015) significantly predicted the quality of the case summary statements. The intercorrelations of all variables of the regression model are shown in table 2 [Tab. 2]. There are significant correlations between teamwork and every predictor in our model, whereas perceived strain and adherence to procedures correlate with no other variable. The regression model was controlled for age, gender, undergraduate curriculum, academic advancement, and high school grade point average, which had no effect on clinical reasoning. An overview of the regression model with all predictors (medical knowledge, perceived strain, stress resistance, adherence to procedures, teamwork) is provided in figure 1 [Fig. 1].


Discussion
In our assessment, the quality of case summary statements as an indicator of clinical reasoning was predicted by two factors. Medical knowledge explained 10.3% of the variance in the quality of the case summary statements. It is the basis of clinical reasoning, and decision making cannot begin without the necessary knowledge of medical subjects [33]. In another study, postgraduate students with better knowledge of basic clinical care also showed better clinical reasoning skills in an exam [34]. The more medical students’ knowledge increased over time in a progress test at a medical school with a problem-based curriculum, the greater certain indicators of their clinical reasoning became [35]. However, even though third-year postgraduate trainees have greater medical knowledge, they were observed to commit heuristic errors in clinical reasoning similar to those of first-year residents [36]. Proficiency in medical knowledge and clinical reasoning, particularly in presenting cases, is linked, since specialized vocabulary is acquired while students gain experience and improve their understanding of diseases [37].

The factor teamwork explained 10.0% of the variance in the quality of the case summary statements as an indicator of clinical reasoning in our study. In a human patient simulation for pharmacy students, clinical judgement and problem-solving skills, which are needed for clinical reasoning, improved in combination with teamwork while the cases were solved successfully [38]. Dental students who participated in a problem-based learning course also reported an increase in their teamwork and problem-solving skills [39]. Teamwork has also been shown to facilitate the development of creative solutions to challenging problems [40]. Simulation-based learning in teams during undergraduate medical education can also be fostered by using collaboration scripts [41]. Even though no explicit teamwork task for clinical reasoning was implemented in our 360-degree assessment, participants could discuss their thoughts on the management of the patients with their supervisor or other health care personnel and could request laboratory and radiology tests before writing their case summary statements.

Contextual factors like the emotional reaction of students (e.g. to patients’ conditions or behavior) and the physician-patient relationship were found to have an impact on clinical reasoning [12]. In our study, students’ perceived strain did not predict the quality of case summary statements as an indicator of clinical reasoning, which could be due to the fact that the students perceived moderate strain, but not actual stress, during the simulated workday. Similarly, no differences with respect to diagnostic accuracy and clinical reasoning arguments between stressed and less stressed students were found elsewhere [42].

During the GAP test, students showed the highest scores for adherence to procedures compared to the scores for the other competences [30]. However, adherence to procedures did not predict clinical reasoning. Apparently, the clinical reasoning process, reflected by the quality of case summary statements, requires more skills than just carefully following the rules. It refers to a thinking process which includes medical problem solving and medical decision making skills [33]. Furthermore, it requires the ability to switch from intuitive to analytical thinking to make correct diagnoses when patient cases are complex [43].

A strength of our study is that students from medical schools with different undergraduate curricula and at different stages of academic advancement participated. This enabled us to control for these factors in our analysis. A weakness of our study is that the summary statements were assessed by just one experienced physician. However, she has been teaching clinical reasoning for many years [2] and was involved in the design and operationalization of the 360-degree assessment [23]. Furthermore, good interrater reliability has been shown for the instrument [44], and the original instrument was also used by one rater for the assessment of the case summary statements [9]. A better approach would have been to evaluate the whole Post Encounter Form and calculate a regression analysis. Unfortunately, only very few of the participating students completed the whole form, which would have reduced the sample size to an extent that would have made a regression analysis impossible. Despite the small sample size, we were able to identify significant predictors of clinical reasoning measured with a validated scoring form [9]. On the other hand, the low reliability of the assessment of the case summary statements of the PEF is another limitation of our study. However, with our simulation we created a realistic environment to investigate factors which can influence aspects of clinical reasoning, supported by the validated GAP test used for flight school applicants [31].


Conclusion
Medical knowledge and teamwork predicted the quality of case summary statements as an indicator of clinical reasoning in undergraduate medical students from medical schools with different curricula and at different stages of academic advancement during a simulated first day of residency. Teamwork supports good quality of case summary statements as an aspect of clinical reasoning, which might be due to teamwork involving social sensitivity and the exchange of information. Thus, it might be useful to support medical students in developing the ability to work in teams as well as in acquiring long-term knowledge to improve the quality of their case summary statements as an indicator of clinical reasoning.


This study was part of the project ÄKHOM, supported by a grant from the Federal Ministry of Education and Research (BMBF), reference number: 01PK1501A/B/C. The study was performed in accordance with the Declaration of Helsinki, and the Ethics Committee of the Chamber of Physicians, Hamburg, confirmed the innocuousness of the study, in which participation was consented, anonymized, and voluntary (PV3649).


Acknowledgements
We thank all the medical students who participated in this study.

Competing interests

The authors declare that they have no competing interests.


References
1. Eva KW. What every teacher needs to know about clinical reasoning. Med Educ. 2005;39(1):98-106. DOI: 10.1111/j.1365-2929.2004.01972.x
2. Harendza S, Krenz I, Klinge A, Wendt U, Janneck M. Implementation of a Clinical Reasoning Course in the Internal Medicine trimester of the final year of undergraduate medical training and its effect on students' case presentation and differential diagnostic skills. GMS J Med Educ. 2017;34(5):Doc66. DOI: 10.3205/zma001143
3. Kiran HS, Chacko TV, Murthy KA, Gowdappa HB. Enhancing the Clinical Reasoning Skills of Postgraduate Students in Internal Medicine Through Medical Nonfiction and Nonmedical Fiction Extracurricular Books. Mayo Clin Proc. 2016;91(12):1761-1768. DOI: 10.1016/j.mayocp.2016.07.022
4. Durning S, Artino AR Jr, Pangaro L, van der Vleuten CP, Schuwirth L. Context and clinical reasoning: understanding the perspective of the expert's voice. Med Educ. 2011;45(9):927-938. DOI: 10.1111/j.1365-2923.2011.04053.x
5. Bordage G. Elaborated knowledge: a key to successful diagnostic thinking. Acad Med. 1994;69(11):883-885. DOI: 10.1097/00001888-199411000-00004
6. Kahneman D. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux; 2011.
7. Edwards JC, Brannan JR, Burgess L, Plauche WC, Marier RL. Case presentation format and clinical reasoning: a strategy for teaching medical students. Med Teach. 1987;9(3):285-292. DOI: 10.3109/01421598709034790
8. Jain V, Rao S, Jinadani M. Effectiveness of SNAPPS for improving clinical reasoning in postgraduates: randomized controlled trial. BMC Med Educ. 2019;19(1):224. DOI: 10.1186/s12909-019-1670-3
9. Durning SJ, Artino A, Boulet J, La Rochelle J, Van der Vleuten C, Arze B, Schuwirth L. The feasibility, reliability, and validity of a post-encounter form for evaluating clinical reasoning. Med Teach. 2012;34:30-37. DOI: 10.3109/0142159X.2011.590557
10. Sacher AG, Detsky AS. Taking the stress out of morning report: an analytic approach to the differential diagnosis. J Gen Intern Med. 2009;24(6):747-751. DOI: 10.1007/s11606-009-0953-5
11. Hofer BK. Epistemological understanding as a metacognitive process: Thinking aloud during online searching. Educ Psychol. 2004;39:43-55. DOI: 10.1207/s15326985ep3901_5
12. McBee E, Ratcliffe T, Schuwirth L, O'Neill D, Meyer H, Madden SJ, Durning SJ. Context and clinical reasoning. Understanding the medical student perspective. Perspect Med Educ. 2018;7(4):256-263. DOI: 10.1007/s40037-018-0417-x
13. Compton MT, Carrera J, Frank E. Stress and depressive symptoms/dysphoria among US medical students: results from a large, nationally representative survey. J Nerv Ment Dis. 2008;196(12):891-897. DOI: 10.1097/NMD.0b013e3181924d03
14. Richter P, Hacker W. Mental stress: mental fatigue, monotony, satiety and stress. Band 2 von Spezielle Arbeits- und Ingenieurpsychologie in Einzeldarstellungen. Heidelberg, Berlin: Springer-Verlag; 1984.
15. Simmenroth-Nayda A, Görlich Y. Medical school admission test: advantages for students whose parents are medical doctors? BMC Med Educ. 2015;15:81. DOI: 10.1186/s12909-015-0354-x
16. De Feyter T, Caers R, Vigna C, Berings D. Unraveling the impact of the Big Five personality traits on academic performance: The moderating and mediating effects of self-efficacy and academic motivation. Learn Individ Differ. 2012;22(4):439-448. DOI: 10.1016/j.lindif.2012.03.013
17. Sim JH, Abdul Aziz YF, Mansor A, Vijayananthan A, Foong CC, Vadivelu J. Students' performance in the different clinical skills assessed in OSCE: what does it reveal? Med Educ Online. 2015;20:26185. DOI: 10.3402/meo.v20.26185
18. Croskerry P. Diagnostic Failure: A Cognitive and Affective Approach. Advances in Patient Safety: From Research to Implementation. Rockville, Md: Agency for Health Care Research and Quality; 2006. DOI: 10.1037/e448242006-001
19. Kiesewetter J, Fischer F, Fischer MR. Collaborative Clinical Reasoning - A Systematic Review of Empirical Studies. J Contin Educ Health Prof. 2017;37(2):123-128. DOI: 10.1097/CEH.0000000000000158
20. Blondon KS, Maître F, Muller-Juge V, Bochatay N, Cullati S, Hudelson P, Vu NV, Savoldelli GL, Nendaz MR. Interprofessional collaborative reasoning by residents and nurses in internal medicine: Evidence from a simulation study. Med Teach. 2017;39(4):360-367. DOI: 10.1080/0142159X.2017.1286309
21. Hege I, Kononowicz AA, Kiesewetter J, Foster-Johnson L. Uncovering the relation between clinical reasoning and diagnostic accuracy - An analysis of learner's clinical reasoning process in virtual patients. PLoS One. 2018;13(10):e0204900. DOI: 10.1371/journal.pone.0204900
22. Daniel M, Rencic J, Durning SJ, Holmboe E, Santen SA, Lang V, Ratcliffe T, Gordon D, Heist B, Lubarsky S, Estrada CA, Ballard T, Artino AR Jr, Sergio DA, Silva A, Cleary T, Stojan J, Gruppen LD. Clinical reasoning assessment methods: a scoping review and practical guidance. Acad Med. 2019;94(6):902-912.
23. Harendza S, Berberat PO, Kadmon M. Assessing Competences in Medical Students with a Newly Designed 360-Degree Examination of a Simulated First Day of Residency: A Feasibility Study. J Community Med Health Educ. 2017;7:4. DOI: 10.4172/2161-0711.1000550
24. Fürstenberg S, Schick K, Deppermann J, Prediger S, Berberat PO, Kadmon M, Harendza S. Competencies for first year residents - physicians' views from medical schools with different undergraduate curricula. BMC Med Educ. 2017;17:154. DOI: 10.1186/s12909-017-0998-9
25. Kristensen TS. Job stress and cardiovascular disease: a theoretic critical review. J Occup Health Psychol. 1996;1(3):246-260. DOI: 10.1037/1076-8998.1.3.246
26. Fürstenberg S, Prediger S, Kadmon M, Berberat PO, Harendza S. Perceived strain of undergraduate medical students during a simulated first day of residency. BMC Med Educ. 2018;18:322. DOI: 10.1186/s12909-018-1435-4
27. Müller B, Basler HD. Kurzfragebogen zur aktuellen Beanspruchung (KAB). Weinheim: Beltz; 1993.
28. Le T, Vieregger K. First aid Q & A for the USMLE step 2 CK. 2nd ed. New York: McGraw-Hill; 2010.
29. Raupach T, Vogel D, Schiekirka S, Keijsers C, Ten Cate O, Harendza S. Increase in medical knowledge during the final year of undergraduate medical education in Germany. GMS Z Med Ausbild. 2013;30(3):Doc33. DOI: 10.3205/zma000876
30. Harendza S, Soll H, Prediger S, Kadmon M, Berberat PO, Oubaid V. Assessing core competences of medical students with a test for flight school applicants. BMC Med Educ. 2019;19:9. DOI: 10.1186/s12909-018-1438-1
31. Oubaid V, Zinn F, Gundert D. GAP: assessment of performance in teams - a new attempt to increase validity. In: De Voogt A, D'Olivera TC, editors. Mechanisms in the chain of safety: Research and operational experiences in aviation psychology. Aldershot (England): Ashgate; 2012. p.7-17.
32. Oubaid V, Zinn F, Klein J. Selecting pilots via computer based measurement of group and team performance - Development of a new Assessment Centre method. Proceedings of the 28th EEAP Conference. 2008. p.55-59.
33. Elstein AS, Shulman L, Sprafka S. Medical Problem Solving: An analysis of clinical reasoning. Cambridge, Massachusetts: Harvard University Press; 1978. DOI: 10.4159/harvard.9780674189089
34. Tokuda Y, Soshi M, Okubo T, Nishizaki Y. Postgraduate medical education in Japan: Missed opportunity for learning clinical reasoning. J Gen Fam Med. 2018;19(5):152-153. DOI: 10.1002/jgf2.202
35. Boshuizen HP, van der Vleuten CP, Schmidt HG, Machiels-Bongaerts M. Measuring knowledge and clinical reasoning skills in a problem-based curriculum. Med Educ. 1997;31:115-121. DOI: 10.1111/j.1365-2923.1997.tb02469.x
36. Rylander M, Guerrasio J. Heuristic errors in clinical reasoning. Clin Teach. 2016;13(4):287-290. DOI: 10.1111/tct.12444
37. Melvin L, Cavalcanti RB. The oral case presentation: A key tool for assessment and teaching in competency-based medical education. JAMA. 2016;316(21):2187-2188. DOI: 10.1001/jama.2016.16415
38. Vyas D, Ottis EJ, Caligiuri FJ. Teaching clinical reasoning and problem solving skills using human patient simulation. Am J Pharm Educ. 2011;75(9):189. DOI: 10.5688/ajpe759189
39. Grady R, Gouldsborough I, Sheader E, Speake T. Using innovative group work activities to enhance the problem based learning experience for dental students. Eur J Dent Educ. 2009;13(4):190-198. DOI: 10.1111/j.1600-0579.2009.00572.x
40. Drinka TJK, Miller TF, Goodman BM. Characterizing motivational styles of professionals who work on interdisciplinary healthcare teams. J Interprof Care. 1996;10:51-61. DOI: 10.3109/13561829609082682
41. Zottmann JM, Dieckmann P, Traszow T, Rall M, Fischer F. Just watching is not enough: Fostering simulation-based learning with collaboration scripts. GMS J Med Educ. 2018;35(3):Doc35. DOI: 10.3205/zma001181
42. Pottier P, Dejoie T, Hardouin JB, Le Loupp AG, Planchon B, Bonnaud A, Leblanc VR. Effect of stress on clinical reasoning during simulated ambulatory consultations. Med Teach. 2013;35(6):472-480. DOI: 10.3109/0142159X.2013.774336
43. Van den Berge K, Mamede S. Cognitive diagnostic error in internal medicine. Eur J Intern Med. 2013;24(6):525-529. DOI: 10.1016/j.ejim.2013.03.006
44. Wijnen-Meijer M, Van der Schaaf M, Booji E, Harendza S, Boscardin C, Van Wijngaarden J, Ten Cate TJ. An argument-based approach to the validation of UHTRUST: can we measure how recent graduates can be trusted with unfamiliar tasks? Adv Health Sci Educ Theory Pract. 2013;18(5):1009-1027. DOI: 10.1007/s10459-013-9444-x