gms | German Medical Science

GMS Journal for Medical Education

Gesellschaft für Medizinische Ausbildung (GMA)

ISSN 2366-5017

How well do final year undergraduate medical students master practical clinical skills?


  • Sylvère Störmann (corresponding author) - Klinikum der Universität München, Medizinische Klinik und Poliklinik IV, München, Germany
  • Melanie Stankiewicz - Klinikum der Universität München, Medizinische Klinik und Poliklinik IV, München, Germany
  • Patricia Raes - Klinikum der Universität München, Medizinische Klinik und Poliklinik IV, München, Germany
  • Christina Berchtold - Klinikum der Universität München, Medizinische Klinik und Poliklinik IV, München, Germany
  • Yvonne Kosanke - Klinikum der Universität München, Medizinische Klinik und Poliklinik IV, München, Germany
  • Gabrielle Illes - Klinikum der Universität München, Medizinische Klinik und Poliklinik IV, München, Germany
  • Peter Loose - Klinikum der Universität München, Medizinische Klinik und Poliklinik IV, München, Germany
  • Matthias W. Angstwurm - Klinikum der Universität München, Medizinische Klinik und Poliklinik IV, München, Germany

GMS J Med Educ 2016;33(4):Doc58

doi: 10.3205/zma001057, urn:nbn:de:0183-zma0010578

This is the English version of the article.
The German version can be found at: http://www.egms.de/de/journals/zma/2016-33/zma001057.shtml

Received: September 30, 2015
Revised: March 11, 2016
Accepted: May 10, 2016
Published: August 15, 2016

© 2016 Störmann et al.
This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 License. See license information at http://creativecommons.org/licenses/by/4.0/.


Abstract

Introduction: The clinical examination and other practical clinical skills are fundamental to guiding diagnosis and therapy. The teaching of such practical skills has gained significance through legislative changes and adjustments of the curricula of medical schools in Germany. We sought to find out how well final year undergraduate medical students master practical clinical skills.

Methods: We conducted a formative 4-station objective structured clinical examination (OSCE) focused on practical clinical skills during the final year of undergraduate medical education. Participation was voluntary. In addition to the examination of the heart, lungs, abdomen, vascular system, and lymphatic system as well as the neurological, endocrinological, or orthopaedic examination, we assessed other basic clinical skills (e.g. interpretation of an ECG, reading a chest X-ray). Participants filled out a questionnaire prior to the exam, inter alia to give an estimate of their performance.

Results: 214 final year students participated in our study and achieved a mean score of 72.8% of the total obtainable score. 9.3% of participants (n=20) scored insufficiently (<60%). We found no influence of sex, prior training in healthcare, or place of study on performance. Only about one third of the students correctly estimated their performance (35.3%), whereas 30.0% and 18.8% over-estimated their performance by one and by two grades, respectively.

Discussion: Final year undergraduate medical students demonstrate considerable deficits in performing practical clinical skills in the context of a formative assessment. Almost half of the students over-estimate their own performance. We recommend an institutionalised and frequent assessment of practical clinical skills during undergraduate medical education, especially in the final year.

Keywords: practical examination, OSCE, physical examination, clinical skills


1. Introduction

The basic clinical examination is a fundamental skill of physicians that guides diagnosis and therapy [1], [2], [3]. Technical progress has changed the practice of medicine, which increasingly relies on laboratory assessments, diagnostic imaging, and other technical examinations. Some authors lament that this has led to a deterioration of the ability to perform a systematic and focused hands-on clinical examination [4], [5]. Medical students and young physicians alike demonstrate deficits in performing a clinical examination [6], [7].

A legislative reform in Germany in 2002 (Medical Licensure Act) pushed the faculties to develop new curricula with an emphasis on practical training [8], [9], [10], [11], [12], [13], [14]. By specifically defining its type and amount, this law formalised clinical practical teaching and its curricular design [15]. This led to novel teaching concepts as well as whole curricula centred on medical skills [16], [17], [18], [19], [20]. Intensive training of practical clinical skills during the first years of undergraduate medical education aims to prepare students well for their future role as physicians; the final year is an important landmark in undergraduate training and consists mainly of practical exercise of previously and newly acquired skills [21].

The Medical Curriculum Munich (MeCuM) integrates clinical training into pre-clinical courses during year 2 of undergraduate studies, starting with history taking. Through a variety of formats (lectures, bedside teaching, peer teaching, and blended learning) students learn the theoretical basis of the clinical examination and have the opportunity to put it into practice. Longitudinal internships in general medical practices and frequent bedside courses allow practical exercise, and feedback discussions steer the learning. As students progress, further practical clinical skills are taught in the context of their respective organ system (such as writing and reading an ECG as part of the cardiovascular teaching block, or interpretation of a chest X-ray during the respiratory teaching block). Higher-level skills such as clinical decision-making are part of the training during the final year, when students should already be proficient in basic clinical skills.

However, a comparison of performance in the Licensure Examination before and after the reform at both medical faculties in Munich (LMU and TU) showed a statistically significant decline of scores in the oral and practical part of the exam [22]. Changes in medical curricula in Germany and their impact on the increase in medical knowledge during clinical training are well studied [23], [24]; investigations of learned practical skills and the achievement of competence-based learning objectives are lacking. It is therefore unclear how well undergraduate medical students who received training more focused on clinical skills effectively master these skills in their final year.

Furthermore, adequate self-assessment of one's performance, and therefore of one's own limitations, plays a crucial role in the care of patients. Everyone involved in patient care should seek help when overburdened [25]. It is important for every (aspiring) physician to recognise the limits of their own abilities and to prevent harm through erroneous action or faulty omission. Danger lies in over-estimation (unconscious incompetence) as well as under-estimation (unconscious competence) of one's abilities [26]. Multiple studies have shown that subjective self-assessment and objectively measured performance do not necessarily correlate [27], [28], [29]. Undergraduate medical students have a responsibility towards their patients as well as their teachers to estimate their skills adequately in order to improve on deficits and further develop strengths. This holds especially true as the physical examination is a cornerstone of diagnosis [2], [30]. Typically, final year medical students in Germany are the first to see newly admitted patients. A realistic self-assessment of examination proficiency is therefore vital for patients' well-being and the further course of their hospital stay. We wanted to know how well undergraduate students at this advanced and critical stage of their training could estimate their own ability to perform basic clinical skills.


2. Methods

Undergraduate medical students in their final year could participate in a formative oral and practical examination (“mündlich-praktische Prüfung im PJ”, abbreviated: mP3) using the OSCE format (objective structured clinical examination) from mid-2011 through 2014. The intent of this examination was to offer participants the possibility to objectively assess their practical clinical skills and to obtain individual feedback identifying strengths and weaknesses. The examination consisted of four OSCE stations. The stations covered various aspects of the physical examination: heart, lungs, abdomen, and vascular/lymphatic system as well as the neurological, endocrinological, and orthopaedic examination. Amongst others, the stations covered basic clinical skills such as writing and reading a 12-lead ECG, basic interpretation of an abdominal CT scan, identifying normal and abnormal findings on a chest X-ray, outlining the management of an emergency in the ER, and enumerating important laboratory parameters to aid differential diagnosis in specified clinical settings. Two thirds of each station were devoted to the physical examination, the remaining third assessed other clinical skills. Each instalment of the OSCE consisted of stations compiled from a pool of 12 different OSCE stations. An expert panel designed and validated all stations. Participants performed the physical examination on volunteer subjects instructed not to give any feedback during the examination. Marks for specific steps of the physical examination were awarded only if that step was performed correctly in its entirety. Each station lasted precisely 12 minutes; afterwards, students received 3 minutes of feedback from the examiner. All 19 examiners were faculty staff members with experience in examining OSCEs as well as professional experience as clinicians. Workshops held frequently for faculty by our Institute for Didactics and Education Research ensure a high standard of quality in the implementation of assessments such as the OSCE.

Participants voluntarily filled out a questionnaire covering personal and demographic details, the course of studies, prior training in healthcare (e.g. as a paramedic or nurse), as well as the mark they assumed they had achieved in the examination (5-point scale as commonly used in Germany for school grading: exam mark 1 = “excellent” to 5 = “insufficient”). To allow comparison of the self-assessment with the OSCE score (expressed as a percentage of total achievable points), we converted the OSCE score into the same 5-point scale according to a conversion scheme common in Germany and used in the National License Examinations [8]. Students received a notification of their achieved score after the examination.
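For illustration, this conversion can be sketched in a few lines of Python. The sketch assumes the band boundaries reported in the Results section (a score of at least 90% maps to mark 1, each band below spans 10 percentage points, and anything below 60% is insufficient); how the official scheme assigns exact boundary values may differ.

def percent_to_mark(score_percent):
    # Map an OSCE score (% of achievable points) onto the German 5-point scale.
    # Band boundaries follow the Results section of this article; treating a
    # boundary value as belonging to the better mark is our assumption.
    if score_percent >= 90:
        return 1  # "excellent"
    if score_percent >= 80:
        return 2  # "good"
    if score_percent >= 70:
        return 3  # "fair"
    if score_percent >= 60:
        return 4  # "poor"
    return 5      # "insufficient"

print(percent_to_mark(72.8))  # mean total score reported below -> 3 ("fair")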

All statistical analyses were performed using SPSS (IBM Corp., Armonk, NY, U.S.A.). Differences between two means were tested with t-tests; with multiple groups, an analysis of variance was performed. Effect sizes were assessed using Cohen’s d. p values below the significance level α=0.05 were considered statistically significant.
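As a minimal sketch of this analysis pipeline (we used SPSS for the actual analyses; the Python/SciPy code below merely illustrates the equivalent computation), the two-group comparison reported in the Results section can be reproduced along the following lines. The simulated data are hypothetical; means, standard deviations, and approximate group sizes are taken from the Results section.

import numpy as np
from scipy import stats

def cohens_d(a, b):
    # Cohen's d: difference of means divided by the pooled standard deviation
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * np.var(a, ddof=1)
                  + (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
    return (np.mean(a) - np.mean(b)) / np.sqrt(pooled_var)

# Simulated OSCE scores (%); group sizes approximated from 63.1% of n=214
rng = np.random.default_rng(42)
prepared = rng.normal(78.6, 10.0, 79)    # participants who prepared
unprepared = rng.normal(71.8, 9.3, 135)  # participants who did not prepare

t, p = stats.ttest_ind(prepared, unprepared)  # two-sided two-sample t-test
print(f"t = {t:.2f}, p = {p:.4f}, d = {cohens_d(prepared, unprepared):.2f}")
print("significant at alpha = 0.05:", p < 0.05)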

The procedure, purpose, and intended scientific analysis of the data from this practical examination were reported to the local ethics committee, which deemed formal ethical approval unnecessary. The study was conducted according to the principles of the World Medical Association’s Declaration of Helsinki and the Declaration of Geneva. All undergraduate medical students in their final year could participate. Participation was voluntary, and participants gave written consent to the scientific analysis of the examination and the publication of results. Not consenting did not exclude students from the examination.


3. Results

Study population

214 students participated in the study from mid-2011 until the end of 2014. Median age of participants was 26.3 (±4.5) years. Almost two thirds (64.0%; n=137) were female, corresponding to the gender distribution of all undergraduate medical students at the LMU Munich. There was no significant age difference between female and male participants (m=27.5±3.2 years; f=27.1±5.1 years; p=0.544). Most participants (n=156; 72.9%) had pursued their medical studies at the LMU Munich from the beginning; the others had joined the LMU Munich at later stages of their undergraduate studies.

Total performance

On average, participants achieved 72.8%±10.1% of the maximum total score (see Figure 1 [Fig. 1]). After converting the performance score into the 5-point scale, only 3.7% achieved an examination mark of “1” (“excellent”; score ≥90%; n=8), 23.4% a mark of “2” (“good”; score 80–90%; n=50), 34.6% a mark of “3” (“fair”; score 70–80%; n=74), and 29.0% a mark of “4” (“poor”; score 60–70%; n=62). Twenty students (9.3%) had an “insufficient” score (defined as <60%; mark of “5”).

Confounding factors

Female participants tended towards slightly higher scores; however, this difference was not significant (73.7%±10.1% versus 71.1%±9.9%; p=0.069). There were no significant differences in scores between participants who had studied at the LMU Munich from the beginning and those who had joined later (p=0.349). Prior training in healthcare was not associated with different scores (p=0.363). Scores were homogeneously distributed across the examination years 2011 to 2014 (p=0.881).

The majority of participants (63.1%) stated that they had not prepared specifically for the exam. They achieved significantly lower scores than prepared students (71.8%±9.3% vs. 78.6%±10.0%; p<0.001; d=0.27). For an overview of these results, cf. Table 1 [Tab. 1].

Self-assessment

170 participants (79.4%) gave an estimate of their performance in the examination. Self-assessed performance and total examination score correlated positively and significantly (r=0.26; p<0.001). On average, students over-estimated their performance by half an examination mark. 60 participants (35.3%) assessed their performance correctly. 51 students (30.0%) over-estimated their performance by one mark, 32 participants (18.8%) by two marks. 21 students (12.4%) under-estimated their performance by one mark, 6 participants (3.5%) by two marks. Of the 20 participants with a total score below 60% (“insufficient”), 16 had self-assessed their performance, of whom 13 (81.3%) were over-estimating. See Figure 2 [Fig. 2] for an overview of self-assessment in relation to total score.


4. Discussion

Practical clinical skills such as the physical examination remain an important instrument in the physician’s armamentarium. Our analysis of a formative, oral-practical examination of undergraduate medical students in their final year showed a lack of these skills despite the advanced course of studies and imminent licensure. Our participants had trouble performing a physical examination as well as basic clinical procedures such as writing and reading an ECG. A comparable analysis of American students during the USMLE Step 2 Clinical Skills Examination yielded similar results [31]. Recently, Schmidmaier et al. used a progress test to show that knowledge of internal medicine increases continuously at the LMU Munich [24]. However, such results do not generalize to practical clinical skills [32].

In Germany, acquiring new skills and improving existing ones during the final year of undergraduate medical studies relies heavily on supervision and mentoring by the physicians of the wards where students spend their final year. In practice, supervision is often lacking, and the acquisition and improvement of skills depend largely on chance and the individual commitment of the students [33], [34], [35]. A rather new approach is to follow the development of clinical skills longitudinally with a progress test [36], [37], [38]; so far, published data are lacking.

Learning practical clinical skills requires complex interventions and a seamless interaction between all parties involved (medical faculty, teaching hospitals, and other hospitals/practices where students complete clinical traineeships). In reality, this is hardly controllable, and students develop a large part of their “clinical practice” outside class [39]. Effectively this means that an important part of medical training is beyond the grasp of university structures and therefore escapes institutional quality standards. So far, teaching of practical clinical skills at faculty level focuses on the use of skills labs [40], [41], where teaching is mostly performed by peers (student tutors). Multiple studies have shown this concept to be effective [42], [43]. Structured formats lastingly improve practical clinical skills acquired in the skills lab [44]. Another mechanism is to perform intermittent formative examinations and make use of the “assessment drives learning” effect [45]. With respect to the data presented herein, it seems important to perform formative examinations assessing clinical skills as a measure of quality assurance during the final year of undergraduate medical education [46]. Ideally, these examinations should be composed of assessments of diverse skills compiled from an exhaustive catalogue [47]. In light of the high costs of these examination formats [48], [49], faculties have to rethink how to find affordable solutions to improve the teaching of clinical skills, such as special tutorship programmes [50].

Important questions in this context are: Who profits from such formative examinations? How frequent should they be? At what point during the course of medical studies should the first examination take place? Is the OSCE the right format for such examinations? The literature offers no generalizable recommendations to answer these questions. To select specific students (and therefore to favour them) might seem ethically ill-advised. However, additional interventions have proven helpful for students at risk of attrition [51], [52]. From an economic standpoint, it could be justified to limit additional resources to those students at risk. A possible compromise (albeit one increasing administrative overhead) could be to offer a certain minimum of such examinations to all students and to examine students at risk more frequently. How often, and from which year of undergraduate education onwards, remains unclear, as data are scarce [53]. We think that such examinations should begin early to avoid giving feedback too late, i.e. when incorrect manoeuvres have already become routine. Alternatives are formats such as the Mini-CEX and others (CEC, DOPS) that can take place directly at the “workplace”. As such, they offer interesting possibilities to assess clinical skills intermittently with comparatively little a priori effort [54], [55], [56]. Conversely, more effort is required to correctly instruct and train examiners for these formats.

One third of all students correctly self-assessed their performance. Almost half of our participants over-estimated their performance, nearly one in five to a vast degree. This is more than previously described [29], [57]. The phenomenon is not new and is neither limited to medical studies nor to students per se [58], [59]. Our students received structured and qualitative feedback after the examination. Many students were surprised when they realised how far off their self-assessment was. Consequences of over-estimation can be serious, in particular when it leads to diagnostic and/or therapeutic errors [60], [61], [62]. The ability to self-evaluate correctly is difficult to teach or train. It has been postulated that video feedback alone would suffice to improve self-assessment [63]. A study by Hawkins and colleagues achieved improvement only when a video of the students’ performance was juxtaposed with a video demonstrating the correct manoeuvres [64]. It is therefore important to give students a good idea of their performance through structured feedback, but also to show the correct execution of the skill assessed. Criticism must always be communicated in a meaningful way, and it has to be noted that responsiveness to feedback is modulated by expectations and attitudes [65].

We presented data from a formative and voluntary examination for which students had to register actively. This may have biased our results: one would expect particularly eager and motivated students to participate in an optional examination, skewing results towards better performance. Deficits in the overall final year population may therefore be even larger than those we observed; in so far, we deem our data and the conclusions derived from them plausible. Our examination is well accepted, and students use it to prepare themselves for the final part of the Licensure Examination, which also takes place as an oral and practical exam. A strength of our data is the size of the study population, which allows for reliable statements even when effect sizes are small.


5. Conclusions

The performance of undergraduate medical students in their final year in a formative oral and practical clinical examination leaves ample room for improvement. Almost two thirds of our participants scored “fair” or “poor”; one in ten students failed. Practically half of our students over-estimated their own performance. An established, standardised, formative examination during the final year of undergraduate studies therefore seems vital.


Acknowledgements

We thank our examiners and colleagues for their support and all the efforts put into implementing the mP3 examination: Prof. Dr. Ralf Schmidmaier, PD Dr. Peter Reilich, PD Dr. Stefan Grote, PD Dr. Peter Thaller, PD Dr. Philipp Baumann, Dr. Kathrin Schrödl, Dr. Philip von der Borch, Dr. Bert Urban, Dr. Michael Maier, Dr. Mark op den Winkel, Dr. Costanza Chiapponi, Anja Fischer, Caroline Strenkert, Andreas Sturm, and Christina Berr.


Competing interests

The authors declare that they have no competing interests.


References

1.
Kirch W, Schafii C. Misdiagnosis at a university hospital in 4 medical eras. Medicine (Baltimore). 1996;75(1):29–40. DOI: 10.1097/00005792-199601000-00004
2.
Reilly BM. Physical examination in the care of medical inpatients: an observational study. Lancet. 2003;362(9390):1100–1105. DOI: 10.1016/S0140-6736(03)14464-9
3.
Kugler J, Verghese A. The physical exam and other forms of fiction. J Gen Intern Med. 2010;25(8):756–757. DOI: 10.1007/s11606-010-1400-3
4.
Feddock CA. The lost art of clinical skills. Am J Med. 2007;120(4):374–378. DOI: 10.1016/j.amjmed.2007.01.023
5.
Cook C. The lost art of the clinical examination: an overemphasis on clinical special tests. J Man Manip Ther. 2010;18(1):3–4. DOI: 10.1179/106698110X12595770849362
6.
Wilkerson L, Lee M. Assessing physical examination skills of senior medical students: knowing how versus knowing when. Acad Med. 2003;78(10 Suppl):S30–S32.
7.
Ramani S, Ring BN, Lowe R, Hunter D. A pilot study assessing knowledge of clinical signs and physical examination skills in incoming medicine residents. J Grad Med Educ. 2010;2(2):232–235. DOI: 10.4300/JGME-D-09-00107.1
8.
Bundesministerium für Gesundheit. Approbationsordnung für Ärzte. Berlin: Bundesministerium für Gesundheit; 2002.
9.
Eitel F. Die neue Approbationsordnung verlangt tief greifende Änderungen in der Lehrorganisation. Gesundheitswesen (Suppl Med Ausbild). 2002;19(Suppl1):1–2. Available from: https://gesellschaft-medizinische-ausbildung.org/files/ZMA-Archiv/2002/1/Eitel_F-ae.pdf
10.
Steiner T, Schmidt J, Bardenheuer H, Kirschfink M, Kadmon M, Schneider G, Seller H, Sonntag HG, für die HEICUMED Planungsgruppe. HEICUMED: Heidelberger Curriculum Medicinale - Ein modularer Reformstudiengang zur Umsetzung der neuen Approbationsordnung. Gesundheitswesen (Suppl Med Ausbild). 2003;20(Suppl2):87–91. Available from: https://gesellschaft-medizinische-ausbildung.org/files/ZMA-Archiv/2003/2/Steiner_T,_J%C3%BCnger_J,_Schmidt_J,_Bardenheuer_H,_Kirschfink_M,_Kadmon_M,_Schneider_G,_Seller_H,_Sonntag_HG.pdf
11.
Schulze J, Drolshagen S, Nürnberger F, Ochsendorf F. Gestaltung des klinischen Studiums nach den Vorgaben der neuen Ärztlichen Approbationsordnung – Struktur und Organisation. Gesundheitswesen (Suppl Med Ausbild). 2003;20(Suppl2):68–77. Available from: https://gesellschaft-medizinische-ausbildung.org/files/ZMA-Archiv/2003/2/Schulze_J,_Drolshagen_S,_N%C3%BCrnberger_F,_Ochsendorf_F.pdf
12.
van den Bussche H. Lehren und Lernen am UKE - Die Umsetzung der Approbationsordnung für Ärzte in Hamburg. Z Allg Med. 2004;80:431–437. DOI: 10.1055/s-2004-820414
13.
Huwendiek S, Kadmon M, Jünger J, Kirschfink M, Bosse HM, Resch F, Duelli R, Bardenheuer HJ, Sonntag HG, Steiner T. Umsetzung der deutschen Approbationsordnung 2002 im modularen Reformstudiengang Heidelberger Curriculum Medicinale (HeiCuMed). ZfHE. 2008;3(3):17–27. Available from: http://www.zfhe.at/index.php/zfhe/article/view/70/75
14.
Zeuner S, Henke T, Achilles E, Kampmeyer D, Schwanitz P. 36 Different Ways To Study Medicine. GMS Z Med Ausbild. 2010;27(2):Doc20. DOI: 10.3205/zma000657
15.
Haage H. Das neue Medizinstudium. Aachen: Shaker Verlag; 2003. S.320.
16.
Fischer T, Simmenroth-Nayda A, Herrmann-Lingen C, Wetzel D, Chenot JF, Kleiber C, Staats H, Kochen MM. Medizinische Basisfähigkeiten – ein Unterrichtskonzept im Rahmen der neuen Approbationsordnung. Z Allg Med. 2003;79:432–436. Available from: https://www.online-zfa.de/media/archive/2003/09/10.1055-s-2003-43063.pdf
17.
Öchsner W, Forster J. Approbierte Ärzte - kompetente Ärzte? Die neue Approbationsordnung für Ärzte als Grundlage für kompetenzbasierte Curricula. GMS Z Med Ausbild. 2005;22(1):Doc04. Available from: http://www.egms.de/static/de/journals/zma/2005-22/zma000004.shtml
18.
Kulike K, Hilgers J, Störmann S, Hornung T, Dudziak J, Weinmann P, et al. Kerncurriculum für die Medizinische Ausbildung in Deutschland: Ein Vorschlag der Medizinstudierenden Deutschlands. GMS Z Med Ausbild. 2006;23(4):Doc58. Available from: http://www.egms.de/static/de/journals/zma/2006-23/zma000277.shtml
19.
Schnabel KP, Boldt PD, Breuer G, Fichtner A, Karsten G, Kujumdshiev S, Schmidts M, Stosch C. Konsensusstatement "Praktische Fertigkeiten im Medizinstudium" – ein Positionspapier des GMA-Ausschusses für praktische Fertigkeiten. GMS Z Med Ausbild. 2011;28(4):Doc58. DOI: 10.3205/zma000770
20.
Fischer MR, Bauer D, Mohn K. Endlich fertig! Nationale Kompetenzbasierte Lernzielkataloge Medizin (NKLM) und Zahnmedizin (NKLZ) gehen in die Erprobung. GMS Z Med Ausbild. 2015;32(3):Doc35. DOI: 10.3205/zma000977
21.
Jilg S, Möltner A, Berberat P, Fischer MR, Breckwoldt J. How do Supervising Clinicians of a University Hospital and Associated Teaching Hospitals Rate the Relevance of the Key Competencies within the CanMEDS Roles Framework in Respect to Teaching in Clinical Clerkships? GMS Z Med Ausbild. 2015;32(3):Doc33. DOI: 10.3205/zma000975
22.
Seyfarth M, Reincke M, Seyfarth J, Ring J, Fischer MR. Neue ärztliche Approbationsordnung und Notengebung beim Zweiten Staatsexamen. Dtsch Arztebl Int. 2010;107(28-29):500–504.
23.
Raupach T, Vogel D, Schiekirka S, Keijsers C, ten Cate O, Harendza S. Wissenszuwachs im Praktischen Jahr des Medizinstudiums in Deutschland. GMS Z Med Ausbild. 2013;30(3):Doc33. DOI: 10.3205/zma000876
24.
Schmidmaier R, Holzer M, Angstwurm M, Nouns Z, Reincke M, Fischer MR. Querschnittevaluation des Medizinischen Curriculums München (MeCuM) mit Hilfe des Progress Tests Medizin (PTM). GMS Z Med Ausbild. 2010;27(5):Doc70. DOI: 10.3205/zma000707
25.
Happel O, Papenfuß T, Kranke P. Schockraummanagement – Simulation, Teamtraining und Kommunikation für eine bessere Traumaversorgung [Training for real: simulation, team-training and communication to improve trauma management]. Anästhesiol Intensivmed Notfallmed Schmerzther. 2010;45(06):408–415. DOI: 10.1055/s-0030-1255348
26.
Ingham H, Luft J. The Johari Window: a graphic model for interpersonal relations. Los Angeles: Proceedings of the western training laboratory in group development; 1955.
27.
Barnsley L, Lyon PM, Ralston SJ, Hibbert EJ, Cunningham I, Gordon FC, et al. Clinical skills in junior medical officers: A comparison of self-reported confidence and observed competence. Med Educ. 2004;38(4):358–367. DOI: 10.1046/j.1365-2923.2004.01773.x
28.
Mavis B. Self-efficacy and OSCE performance among second year medical students. Adv Health Sci Educ. 2001;6(2):93–102. DOI: 10.1023/A:1011404132508
29.
Jünger J, Schellberg D, Nikendei C. Subjektive Kompetenzeinschätzung von Studierenden und ihre Leistung im OSCE. GMS Z Med Ausbild. 2006;23(3):Doc51. Available from: http://www.egms.de/static/de/journals/zma/2006-23/zma000270.shtml
30.
Verghese A. A doctor's touch. Edinburgh: TEDGlobal 2011; 2011.
31.
Peitzman SJ, Cuddy MM. Performance in Physical Examination on the USMLE Step 2 Clinical Skills Examination. Acad Med. 2015;90(2):209–213. DOI: 10.1097/ACM.0000000000000570
32.
Vukanovic-Criley JM, Criley S, Warde CM, Boker JR, Guevara-Matheus L, Churchill WH, Nelson WP, Criley JM. Competency in cardiac examination skills in medical students, trainees, physicians, and faculty: a multicenter study. Arch Intern Med. 2006;166(6):610–616. DOI: 10.1001/archinte.166.6.610
33.
Engel C, Porsche M, Roth S, Ganschow P, Büchler MW, Kadmon M. Welche Ausbildungsinhalte brauchen Studierende im Praktischen Jahr - Bedarfsanalyse als Basis für ein chirurgisches PJ-Curriculum. GMS Z Med Ausbild. 2008;25(3):Doc89. Available from: http://www.egms.de/static/de/journals/zma/2008-25/zma000573.shtml
34.
Remmen R, Derese A, Scherpbier A, Denekens J, Hermann I, Van Der Vleuten C, et al. Can medical schools rely on clerkships to train students in basic clinical skills? Med Educ. 1999;33(8):600–605. DOI: 10.1046/j.1365-2923.1999.00467.x
35.
Köhl-Hackert N, Krautter M, Andreesen S, Hoffmann K, Herzog W, Jünger J, Nikendei C. Lernen am späteren Arbeitsplatz: eine Analyse studentischer Erwartungen an den Stationseinsatz im Fachbereich der Inneren Medizin. GMS Z Med Ausbild. 2014;31(4):Doc43. DOI: 10.3205/zma000935
36.
Pugh D, Touchie C, Wood TJ, Humphrey-Murto S. Progress testing: is there a role for the OSCE? Med Educ. 2014;48(6):623–631. DOI: 10.1111/medu.12423
37.
Pugh D, Touchie C, Humphrey-Murto S, Wood TJ. The OSCE progress test – Measuring clinical skill development over residency training. Med Teach. 2016;38(2):168–173. DOI: 10.3109/0142159X.2015.1029895
38.
Gold J, DeMuth R, Mavis B, Wagner D. Progress testing 2.0: clinical skills meets necessary science. Med Educ Online. 2015;20.
39.
Duvivier RJ, van Geel K, van Dalen J, Scherpbier AJJA, van der Vleuten CPM. Learning physical examination skills outside timetabled training sessions: what happens and why? Adv Health Sci Educ Theory Pract. 2012;17(3):339–355. DOI: 10.1007/s10459-011-9312-5
40.
Blohm M, Lauter J, Branchereau S, Krautter M, Köhl-Hackert N, Jünger J, Herzog W, Nikendei C. "Peer-Assisted Learning" (PAL) im Skills-Lab – eine Bestandsaufnahme an den Medizinischen Fakultäten der Bundesrepublik Deutschland. GMS Z Med Ausbild. 2015;32(1):Doc10. DOI: 10.3205/zma000952
41.
Profanter C, Perathoner A. DOPS (Direct Observation of Procedural Skills) im studentischen Skills-Lab: Funktioniert das? Eine Analyse der Performanz klinischer Fertigkeiten und der curricularen Nebeneffekte. GMS Z Med Ausbild. 2015;32(4):Doc45. DOI: 10.3205/zma000987
42.
Yu TC, Wilson NC, Singh PP, Lemanu DP, Hawken SJ, Hill AG. Medical students-as-teachers: a systematic review of peer-assisted teaching during medical school. Adv Med Educ Pract. 2011;2:157–172.
43.
Weyrich P, Celebi N, Schrauth M, Möltner A, Lammerding-Köppel M, Nikendei C. Peer-assisted versus faculty staff-led skills laboratory training: a randomised controlled trial. Med Educ. 2009;43(2):113–120. DOI: 10.1111/j.1365-2923.2008.03252.x
44.
Herrmann-Werner A, Nikendei C, Keifenheim K, Bosse HM, Lund F, Wagner R, Celebi N, Zipfel S, Weyrich P. "Best Practice" Skills Lab Training vs. a "see one, do one" Approach in Undergraduate Medical Education: An RCT on Students' Long-Term Ability to Perform Procedural Clinical Skills. PLoS One. 2013;8(9):1–13. DOI: 10.1371/journal.pone.0076354
45.
Buss B, Krautter M, Möltner A, Weyrich P, Werner A, Jünger J, Nikendei C. Der "Assessment Drives Learning"-Effekt beim Training klinisch-praktischer Fertigkeiten - Implikationen für die Curriculumsgestaltung und die Planung von Ressourcen. GMS Z Med Ausbild. 2012;29(5):Doc70. DOI: 10.3205/zma000840
46.
Raes P, Angstwurm M, Berberat P, Kadmon M, Rotgans J, Streitlein-Böhme I, Burckhardt G, Fischer MR. Qualitätsmanagement der klinisch-praktischen Ausbildung im Praktischen Jahr des Medizinstudiums – Vorschlag eines Kriterienkatalogs der Gesellschaft für Medizinische Ausbildung. GMS Z Med Ausbild. 2014;31(4):Doc49. DOI: 10.3205/zma00094
47.
Bouwmans GA, Denessen E, Hettinga AM, Michels C, Postma CT. Reliability and validity of an extended clinical examination. Med Teach. 2015;37(12):1072–1077. DOI: 10.3109/0142159X.2015.1009423
48.
Brown C, Ross S, Cleland J, Walsh K. Money makes the (medical assessment) world go round: The cost of components of a summative final year Objective Structured Clinical Examination (OSCE). Med Teach. 2015;37(7):653–659. DOI: 10.3109/0142159X.2015.1033389
49.
Rau T, Fegert J, Liebhardt H. Wie hoch liegen die Personalkosten für die Durchführung einer OSCE? Eine Kostenaufstellung nach betriebswirtschaftlichen Gesichtspunkten. GMS Z Med Ausbild. 2011;28(1):Doc13. DOI: 10.3205/zma000725
50.
Kraus B, Briem S, Jünger J, Schrauth M, Weyrich P, Herzog W, Zipfel S, Nikendei C. Entwicklung und Evaluation eines Schulungsprogramms für Studenten im Praktischen Jahr in der Inneren Medizin. GMS Z Med Ausbild. 2006;23(4):Doc70. Available from: http://www.egms.de/static/de/journals/zma/2006-23/zma000289.shtml
51.
Yates J. Development of a "toolkit" to identify medical students at risk of failure to thrive on the course: an exploratory retrospective case study. BMC Med Educ. 2011;11(1):95. DOI: 10.1186/1472-6920-11-95
52.
Winston KA, van der Vleuten CP, Scherpbier AJ. Prediction and prevention of failure: An early intervention to assist at-risk medical students. Med Teach. 2014;36(1):25–31. DOI: 10.3109/0142159X.2013.836270
53.
Bosse HM, Mohr J, Buss B, Krautter M, Weyrich P, Herzog W, Jünger J, Nikendei C. The benefit of repetitive skills training and frequency of expert feedback in the early acquisition of procedural skills. BMC Med Educ. 2015;15(1):22. DOI: 10.1186/s12909-015-0286-5
54.
Van Der Vleuten CP, Schuwirth LW. Assessing professional competence: From methods to programmes. Med Educ. 2005;39(3):309–317. DOI: 10.1111/j.1365-2929.2005.02094.x
55.
Epstein RM. Assessment in medical education. N Engl J Med. 2007;356(4):387–396. DOI: 10.1056/NEJMra054784
56.
Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach. 2007;29(9):855–871. DOI: 10.1080/01421590701775453
57.
Abadel FT, Hattab AS. How does the medical graduates' self-assessment of their clinical competency differ from experts' assessment? BMC Med Educ. 2013;13(1):24. DOI: 10.1186/1472-6920-13-24
58.
Falchikov N, Boud D. Student self-assessment in higher education: A meta-analysis. Rev Educ Res. 1989;59(4):395–430. DOI: 10.3102/00346543059004395
59.
Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of Physician Self-assessment Compared With Observed Measures of Competence. JAMA. 2006;296(9):1094–1102. DOI: 10.1001/jama.296.9.1094
60.
Berner ES, Graber ML. Overconfidence as a cause of diagnostic error in medicine. Am J Med. 2008;121(5 Suppl):S2–S23.
61.
Cavalcanti RB, Sibbald M. Am I right when I am sure? Data consistency influences the relationship between diagnostic accuracy and certainty. Acad Med. 2014;89(1):107–113. DOI: 10.1097/ACM.0000000000000074
62.
Verghese A, Charlton B, Kassirer JP, Ramsey M, Ioannidis JP. Inadequacies of Physical Examination as a Cause of Medical Errors and Adverse Events: A Collection of Vignettes. Am J Med. 2015;128(12):1322–1324. DOI: 10.1016/j.amjmed.2015.06.004
63.
Lane JL, Gottlieb RP. Improving the interviewing and self-assessment skills of medical students: is it time to readopt videotaping as an educational tool? Ambul Pediatr. 2004;4(3):244–248. DOI: 10.1367/A03-122R1.1
64.
Hawkins SC, Osborne A, Schofield SJ, Pournaras DJ, Chester JF. Improving the accuracy of self-assessment of practical clinical skills using video feedback – The importance of including benchmarks. Med Teach. 2012;34(4):279–284. DOI: 10.3109/0142159X.2012.658897
65.
Eva KW, Armson H, Holmboe E, Lockyer J, Loney E, Mann K, Sargeant J. Factors influencing responsiveness to feedback: on the interplay between fear, confidence, and reasoning processes. Adv Health Sci Educ Theory Pract. 2012;17(1):15–26. DOI: 10.1007/s10459-011-9290-7