gms | German Medical Science

GMS Journal for Medical Education

Gesellschaft für Medizinische Ausbildung (GMA)

ISSN 2366-5017

Performance effects of simulation training for medical students – a systematic review


  • corresponding author Niall McInerney - Mater Misericordiae University Hospital, UCD Centre for Precision Surgery, Dublin, Ireland; Mater Misericordiae University Hospital, Department of Surgery, Dublin, Ireland; University College Dublin, School of Medicine, Section of Surgery and Surgical Specialties, Dublin, Ireland
  • D. Nally - Mater Misericordiae University Hospital, Department of Surgery, Dublin, Ireland
  • M.F. Khan - Mater Misericordiae University Hospital, UCD Centre for Precision Surgery, Dublin, Ireland; Mater Misericordiae University Hospital, Department of Surgery, Dublin, Ireland; University College Dublin, School of Medicine, Section of Surgery and Surgical Specialties, Dublin, Ireland
  • H. Heneghan - University College Dublin, School of Medicine, Section of Surgery and Surgical Specialties, Dublin, Ireland; St. Vincent’s University Hospital, Department of Surgery, Dublin, Ireland
  • R.A. Cahill - Mater Misericordiae University Hospital, UCD Centre for Precision Surgery, Dublin, Ireland; Mater Misericordiae University Hospital, Department of Surgery, Dublin, Ireland; University College Dublin, School of Medicine, Section of Surgery and Surgical Specialties, Dublin, Ireland

GMS J Med Educ 2022;39(5):Doc51

doi: 10.3205/zma001572, urn:nbn:de:0183-zma0015725

This is the English version of the article.
The German version can be found at: http://www.egms.de/de/journals/zma/2022-39/zma001572.shtml

Received: February 6, 2022
Revised: June 19, 2022
Accepted: August 4, 2022
Published: November 15, 2022

© 2022 McInerney et al.
This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 License. See license information at http://creativecommons.org/licenses/by/4.0/.


Abstract

Objective: Simulation-based medical education (SBME) is fast becoming embedded in undergraduate medical curricula, with many publications now describing its various modes and students' self-reported impacts. This systematic review synthesizes the available literature for evidence of performance effects of SBME as an adjunct within traditional teaching programmes.

Methods: A narrative systematic review was conducted according to PRISMA guidelines using Ovid MEDLINE, EMBASE, and PubMed databases for studies, published in English, reporting on general medical and surgical undergraduate SBME between 2010 and 2020. Two reviewers independently assessed potential studies for inclusion. Methods and topics of simulation with their assessments were evaluated. Descriptive statistics were used to describe pooled student cohorts.

Results: 3074 articles were initially identified using the search criteria, with 92 full-text articles then screened for eligibility. Nineteen articles, including nine randomised trials, concerning 2459 students (median 79/study), were selected for review. Cardiac scenarios were the most common (n=6), with three studies including surgical topics. Nine studies used mannequin simulators (median time/session 17.5 minutes) versus standardised patients in seven (median time/session 82 minutes). Educational impact was measured by written (n=10), checklist (n=5) and OSCE (n=3) assessments, either alone or in combination (n=1, OSCE/written assessment). All articles reported a positive effect of SBME on knowledge, including improved retention in three.

Conclusion: SBME, as an adjunct to existing curricula, improves knowledge-based performance of medical students, at least in the short term. Future studies should broaden its topics and assess longer-term impacts and cost-effectiveness, while also considering whether and in what areas it can replace traditional undergraduate learning.

Keywords: undergraduate, medical education, simulation, performance


Introduction

Medical education largely still follows traditional structures [1], [2]. Students undergo didactic lecture-based learning throughout their studies but especially in their early years [3]. Once on clinical sites, they learn by engaging with clinical teams and real patients. Although proven sufficient over time, the acquisition of medical knowledge and skills in this way has a number of potential pitfalls. Students are expected to learn from and practice recently acquired knowledge and skills on actual patients. This interaction may be complicated in both directions: medical students can be nervous and patients, a vulnerable cohort, can be fearful. Furthermore, the clinical experience may vary between teams and over time, and much of the interaction happens without direct observation by academic faculty. The heterogeneity and inconsistency of clinical exposure, coupled with a lack of assessment-relevant feedback before examinations, is suboptimal and may undermine fairness in competitive assessments and standards in future medical practice. In addition, the COVID-19 pandemic has greatly challenged medical undergraduate programmes and students through the withdrawal of ward-based placements.

To improve their skills in both history taking and physical examination, students have long practiced individually and with their peers. This provides a safe, comfortable environment in which to hone their skills and enables iterative improvement by doing, although it again lacks senior supervision and standardisation and may not challenge students sufficiently. Simulation has long been used in aviation and military training and is fast becoming a formal component of undergraduate and postgraduate medical education [4], [5], [6]. Simulation training, which provides a “device that presents a simulated patient (or part of a patient) that interacts appropriately with the actions taken by the simulation participant”, allows users to learn in a safe, controlled and standardised environment in which skills and knowledge can be applied and practiced [7]. Recent technological advances have increased the capability to realistically mimic actual patients and real-life clinical scenarios [8], [9].

While postgraduate simulation has been studied extensively in the literature (with proven benefits including greater patient safety, improved teamwork and enhanced confidence [10], [11], [12], [13], [14], [15]), there is less evidence detailing the effect of such training on undergraduate medical student performance. Although many previous studies have detailed the self-reported effects of simulation training on medical students [10], [16], [17], objectively assessed impacts need to be established prior to its broad implementation, particularly to justify the necessary expenditure and especially if it is to replace other existing curricular components, either by design or by necessity (e.g. due to public health emergencies). Crucial to any educational innovation is assurance of benefit. The primary metric of performance in medical education is assessment scores. The purpose of this review is to synthesise the available evidence on the performance effect of simulation-based medical education (SBME) as applied to technical, procedural and examination skills for undergraduate medical students in general medicine and surgery.


Methods

Search database

This systematic review was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The searches were performed independently in duplicate (NM, FK) using the Ovid MEDLINE, EMBASE and PubMed databases from 2010 to 2020 inclusive. The final search was completed in January 2021. All eligible records were screened independently (NM, FK) for relevance.

Search terms

The following Medical Subject Headings (MeSH) terms and keywords were used: medical education OR medical students OR medical student AND simulation training OR high fidelity simulation training OR mannequin OR manikins OR SimMan OR simulation. Boolean AND/OR operators were used to combine MeSH terms and keywords. Following the search, titles and abstracts were screened. Full texts of potentially eligible articles were reviewed by two authors (NM, FK) independently, and eligible studies were selected.

Inclusion/exclusion criteria

Our inclusion criteria required all articles to be in English, to be research-based and to include objective investigation of the efficacy of a simulation-based training programme on the clinical performance of undergraduate medical students in the general medical and surgical clinical learning stages. Studies which assessed history taking, physical examination, and clinical practice either separately or together were included regardless of the method of simulation (including whether mannequins or simulated patients were used) and assessment (i.e. whether written, Objective Structured Clinical Examination (OSCE) or tutor assessment). OSCEs in the included studies followed the traditional format described by Harden [18]. Studies assessing procedural protocols (including BLS, ACLS and ATLS) and surgical skills were excluded. Checklist assessments were included, but studies applying exclusively subjective assessment methodology (e.g. overall global rating) were excluded. Studies assessing subspecialty domains such as obstetric, paediatric, anaesthetic and psychiatric simulation were excluded.

Data collection

The following data were extracted from each included publication: first author, publication year, country, study design, number and stage of students, simulation method, assessment method, performance levels pre- and post-simulation, effects on knowledge retention, students’ confidence and the authors’ conclusions.


Results

Study characteristics

Figure 1 [Fig. 1] shows the PRISMA flowchart of the search and selection process. 3074 studies were identified through database searching. Following removal of duplicates, the abstracts of 2716 studies were evaluated and included if deemed suitable. The full texts of 92 articles were assessed for eligibility, and ultimately 19 articles were included in the qualitative synthesis.

Table 1 [Tab. 1] summarizes the nineteen included studies, comprising 2459 students ranging from first to final year. Studies from eleven countries were included, with three focusing on first (n=2) or second year (n=1) students and the remainder concerning those in later years. There were nine randomised controlled trials [19], [20], [21], [22], [23], [24], [25], [26], [27], six prospective cohort studies [28], [29], [30], [31], [32], [33], two crossover studies [34], [35], one retrospective analysis [36], and one case-control study [37]. The median number of students per study was 79 (range 20-615). Due to the heterogeneity of study types, including the variety of tools used for assessing multiple skills, meta-analysis was precluded. Attachment 1 [Attach. 1] summarizes the results from each study.

Specialities

Medical scenarios were the most common topics for simulation, with fourteen groups [28], [36], [30], [31], [22], [32], [24], [34], [33], [25], [26], [35], [27], [37] conducting simulation training based on common medical pathologies. Of these, six groups simulated cardiac scenarios, four assessing auscultation skills [20], [30], [34], [27] and two simulating acute cardiac presentation scenarios [32], [26]. Surgical topics were simulated by two groups [19], [29], while one group [23] used both medical and surgical scenarios for their simulation sessions.

Methods of simulation

Various methods of patient simulation were used, with the majority (n=12) using artificial patient models in scenarios mimicking the medical setting, lasting a median of 17.5 minutes per session (range 15-30 minutes) [28], [21], [36], [30], [31], [22], [24], [34], [33], [35], [27], [37]. SimMan™ (Laerdal) was the most commonly used simulator (n=6) [36], [30], [24], [34], [35], [27]. Harvey (n=2) [33], [37], METI (n=1) [22] and Kyoto Kagaku (n=1) simulators, along with a heart-sound simulator (n=1), were the other artificial simulators used. SimMan is a wireless, life-sized advanced patient simulator that can display the physiological changes the “patient” undergoes in real time on a monitor, under the control of the simulation facilitator [https://laerdal.com/us/products/simulation-training/emergency-care-trauma/simman/]. In all six groups who used SimMan, general medical scenarios involved students taking a history and performing a physical examination [36], [30], [24], [34], [35], [27]. In these, the SimMan displayed abnormal cardiovascular and respiratory signs based on the simulated scenario. In two studies [30], [27], students were given a short orientation (15-30 minutes) to clinical practice using a SimMan.

Standardized patients were used in seven studies [19], [20], [29], [23], [32], [25], [26], three of which [20], [29], [23] focused on assisting clinical examination practice. Standardized patient methods ranged from actors [19], [36], [26] or academic staff mimicking learned symptoms to expert patients, and focused on standardized patient histories [25] and ward rounds [19], with supplementary material such as drug charts, patients’ vital parameters and end-of-bed notes being made available to students in all studies. The median time for simulation with standardized patients was 82.5 minutes (range 15-180 minutes). Giblett et al. [24] cumulatively spent 21 hours over the course of one semester simulating encounters with standardized patients. Giblett et al. [29] and Nassif [20] used standardized patients in conjunction with breast models to educate students on breast examination.

Methods of assessment

Various methods of assessment were used to evaluate the performance effect of simulation training. Written assessment was the most common (n=10) [28], [29], [31], [22], [32], [24], [34], [33], [25], [26], [35], predominantly comprising an MCQ examination. Four groups [19], [23], [25], [36] used a checklist assessment, which was completed either during or after the simulation scenario. Three groups [20], [23], [37] assessed their students using OSCE examinations alone. One group [21] used a combination of OSCE and written assessment.

Effect on performance

All groups reported a performance benefit to students associated with simulation training. Simulation training was shown to have a positive effect when used across a broad range of medical and surgical specialities, in acute and non-acute scenarios.

Auscultation simulation

Swamy [35] reported improved results on a knowledge-based questionnaire following clinical chest examination training with SimMan™ compared to examining their student colleagues, later confirming these results in a further, larger cohort [34]. There was also an improvement noted in the simulation group’s self-perceived confidence. In the cross-over trial, at the mid-test point, the group who performed examinations on a mannequin scored significantly higher on a knowledge assessment than those who performed peer examinations. Perlini [33] also demonstrated the impact of simulation training on retention of knowledge, focusing on cardiac auscultation. After three years, a subgroup of these students was reassessed. Without any further exposure to the Harvey simulator over that timeframe, retention of the acquired capability was maintained. Pereira [31] also showed a positive effect of simulation on cardiac auscultation: when comparing pre- and post-test scores, there was a 16% performance improvement when simulation training was added to the existing curriculum. Equally, Bernardi [28] demonstrated an improvement in cardiac auscultation skills when practiced on a simulator. There was, however, no improvement in respiratory auscultation between the simulation and control groups. Kern et al. [37] implemented a cardiac auscultation programme following previous reports of deficiencies in physicians’ clinical examination skills [38], [39]. In this study, students who received simulation training (using the Harvey simulator) along with the standard curriculum were compared to students who received the standard curriculum alone. To ensure little variation in teaching between the groups, the same three faculty teachers facilitated teaching in the same facility for all students. Students were assessed in a multi-station OSCE five weeks after their respective learning. Students who received simulation training performed significantly better in the assessed cardiac skills than the control group. Again, there was no difference in pulmonary examination skills.

Breast examination simulation

Nassif [20] used a hybrid simulation model of breast examination in which a standardized patient wearing a silicone breast simulator jacket was examined. This group was compared to students who examined a standardized tabletop breast model. Following this intervention, both groups were assessed in an OSCE. Students who participated in hybrid simulation training were significantly better at lesion reporting, identification of malignant features and accurate location identification compared to the group who received traditional teaching. Angarita [21] also evaluated the effect of simulation training on students’ clinical breast examinations. Students were taught using a simulation- and multimedia-based curriculum, which was compared to traditional didactic lecture and clinic-based teaching. Both groups were assessed using written and OSCE assessments. The group who completed the simulation-based training were significantly better at all aspects of the breast exam (including inspection, position, palpation, pressure, axillary exam and providing justifications for performing a breast exam). Additionally, students who underwent simulation training were significantly more confident than their peers who were taught with traditional methods. Alluri [22] used simulation to teach pre-clinical medical students and assessed the effect in a randomised, controlled cross-over study, finding that both simulation and didactic lectures improved student knowledge when assessed by MCQ. When assessing delayed test scores, thus evaluating retention of knowledge, students who completed simulation training demonstrated improvement, whereas those who were taught didactically did not.

Simulation of emergency scenarios

Vattanavanit [36] assessed sixth year medical students’ knowledge and confidence in septic shock resuscitation. Students who received simulation training improved significantly in knowledge and resuscitation skills, whilst also improving their confidence at assessing patients in septic shock (post-simulation 68.1%±12.2% vs pre-simulation 5.64±13.1, p<0.001). Solymos [24] looked at critical care medicine, comparing simulation-based teaching to traditional didactic teaching. Final year students were evaluated using a multiple-choice questionnaire at baseline, post-teaching and at a two-week follow-up. Although there was a significant improvement following simulation compared to the didactic lecture group, baseline scores were higher in the didactic lecture group. McCoy [27] performed a cross-over study, particularly focusing on the assessment of critically unwell patients with myocardial infarction or anaphylaxis. Simulation training was compared to traditional didactic lectures. Students’ performance was evaluated in real time during the simulation. 96% of students performed better when trained with simulation. Overall, simulation training resulted in a 22% absolute increase in scores (95% CI 18-26%). Scores in the history taking (27% absolute increase), physical examination (26%) and patient management (16%) components of the assessment were higher in the simulation group than in the lecture group. DeWaay [26] investigated the performance of fourth year medical students who received simulation training compared to a control group (who received no intervention) and a group who received didactic lectures. Simulation significantly improved overall performance: the percentage of correct answers in the simulation group was 53.5±8.9% compared to 47.9±9% in the didactic teaching group and 47.9±9.8% in the control group (p<0.001). Williams [32] also simulated cardiac emergencies, but this time with real patients with a cardiac history taking the simulated patient role. Students were assessed using knowledge-based short answer questions. Mean scores increased from 25/43 to 34/43 after the intervention. A sustained effect was seen at one week post-intervention, with scores of 35/43. Students’ self-perceived confidence was also improved post-intervention. Sánchez-Ledesma [30] focused on the use of simulation training in the management of neurological emergencies. The simulation instructor evaluated students during the simulation session. Once again, statistically significant differences were found between pre- and post-test results, with results improving further following repeated simulation sessions.

Simulation in non-emergent scenarios

Simulation training was not limited to acute medical presentations. Fisher [25] developed and delivered a simulation programme dealing with common geriatric issues, including delirium, falls and elder abuse. Mannequins and simulated patients were both incorporated into the scenarios. Students were assessed pre-, post- and one month post-simulation, and test scores were compared to those of students who underwent traditional didactic teaching; post-simulation test scores were better than pre-simulation test scores. For all scenarios, there was a statistically significant difference between the simulation and control groups (p<0.005). Students in the Zhang et al. [23] study were pre-selected into simulation and didactic lecture groups by virtue of variations in facilities across their clinical sites. Students simulated both medical and surgical scenarios. Across two year groups, the mean score for 16 OSCE stations was significantly better in those who had undergone simulation training: in 2013, the mean score was 80.95±0.61 for the simulation group versus 69.91±1.24 for the didactic lecture group (p=0.0114), and 86.12±0.56 versus 73.58±1.34 in 2014 (p=0.006).

Simulation in surgical education

Two studies specifically examined simulation in surgery. Giblett [29] randomised two groups of medical students in their first year of clinical attachments. In the first semester, one group received traditional didactic lecture-based education while the other received simulation training broadly covering the surgical curriculum. Using independent t-test analysis, a significant performance benefit in a knowledge-based assessment was seen among the group who received simulation training (p<0.001). Additionally, the simulation group had higher self-reported confidence and understanding of surgical principles. These students also showed substantially improved confidence in acute surgical assessments, particularly in abdominal (p<0.001), vascular (p<0.001) and breast examinations (p<0.001). Grünewald [19] used an objective surgical ward round assessment tool to evaluate students’ performance; the control group did not receive simulation training. Competence in the intervention group improved from 62.6 to 69.6 points (p=0.0169). In contrast, there was no improvement in the control group (pre 62.6 vs post 69.6 points; p=0.72).


Discussion

SBME is of increasing interest for medical undergraduate programmes. This has been especially the case recently, with the COVID-19 pandemic putting pressure on clinical placements and adding emphasis on graduating competent doctors in a timely fashion, or indeed even early. The primary aim of this study was to examine the evidence for an effect of simulation training on medical student performance through a synthesis of the published literature, including a summary of the methods used to provide simulation training and the tools used to assess its efficacy. As evidenced by this review, simulation training in tandem with the traditional curriculum has been shown to generally improve medical students' performance, knowledge retention and confidence compared with didactic teaching and learning through observation. These benefits can be seen across a number of required skills, including core components such as history taking and physical examination (including essential, intimate physical examinations, such as the breast examination, which can otherwise be challenging for students to learn), and across various specialities in both emergency and elective general medical and surgical situations. Furthermore, students who experienced simulation training have been found to be more satisfied with their teaching [29]. These findings should perhaps not be surprising, as students learn best when they are actively involved [40]. While medical training has traditionally relied on the adage “see one, do one, teach one”, simulation-based education provides the opportunity to “do one” repeatedly, safely and under supervision to improve future practice.

SBME of course requires some investment in terms of teaching personnel, equipment and space, meaning objective proof of its usefulness is important to justify the expenditure. Additionally, nuance exists. Hamstra [41] detailed some of the key components of effectively run simulation scenarios: learner engagement and a suspension of disbelief enhance the learning environment for medical students, and by placing them in scenarios and an environment that mimic real life, a superior educational experience can be obtained. Some studies have also suggested that improved student confidence may be a negative finding [42], indicating that further work needs to be done in this specific area. Furthermore, while simulation training has been shown to improve cardiac auscultation skills, it seems to have no effect on respiratory auscultation skills [28], [31], [33], [37]. Bernardi [28] hypothesised that this difference is related to the different teaching methods employed for each and that using graphic representations of the lung sounds heard may offset this. Further, additional “real-world” validity can be added when constructing the scenarios (for example, Williams [32] simulated cardiac emergencies with real patients who had recovered from a previous emergency cardiac presentation, including intermittent interruption of the students to simulate a real-life “on call” scenario, as doctors are often required to multi-task, manage their time efficiently and remain calm under pressure).

As much as simulation training facilitates the standardization of medical education by allowing all students to access similar clinical experiences, it would also seem to provide a useful means of contributing to students’ summative assessment in a manner that is reproducible and objective. To date, written examinations in conjunction with observed clinical examination and skills assessment have traditionally been major components of medical students' assessment [43], [44]. Recently, two studies have indicated that a simulation-based assessment may be appropriate for assessing clinical competence [45], [46]. In addition, while the healthcare educator’s primary function is to produce competent and proficient doctors, the physical and mental wellbeing of our students is increasingly recognised as essential given the rising rates of burnout and mental health issues being reported amongst medical students [47], [48]. A consensus statement on medical student wellbeing from Australia and New Zealand [49] recommends “curricula that promote peer support and progressive levels of challenge to students and to employ strategies to promote positive outcomes from stress and to help others in need”. These strategies are already components of SBME, and further aspects such as resilience training can be readily incorporated. Another area to examine further is whether the confidence improvement provided by simulation can help ease the transition from medical student to junior doctor.

In conclusion, this systematic review provides evidence that SBME can improve medical students’ performance in a variety of domains and specialities while also identifying areas in need of future attention. Alongside performance benefits in history taking and physical examination, there is evidence that SBME leads to greater knowledge retention and confidence. This review therefore supports the use of SBME as an adjunct to the traditional didactic lecture-based curriculum. To ensure the optimal education of medical students, further studies could compare methodologies to identify how SBME is best delivered and whether simulation is best employed as an adjunct to, or a replacement for, the traditional lecture-based curriculum. It is also important to examine cost-effectiveness, especially the role of lower-cost set-ups versus more expensive systems. Ultimately, it is important to correlate the performance effect of simulation training directly with competence. By building such an evidence base, we will best evolve the curriculum to produce better doctors and, most importantly, better patient outcomes.


Limitations

This systematic review includes heterogeneous studies employing various methods of simulation and assessment. As such, meta-analysis was precluded.


Acknowledgements

The authors would like to thank Angela Rice, Library and Information Services, Mater Misericordiae University Hospital for her guidance during this project.


Competing interests

The authors declare that they have no competing interests.

Professor Ronan Cahill is named on a patent filed in relation to processes for visual determination of tissue biology, receives speaker fees from Stryker Corp and Ethicon/J&J, research funding from Intuitive Corp and Medtronic and holds research funding from the Irish Government (DTIF) in collaboration with IBM Research in Ireland and from EU Horizon 2020 in collaboration with Palliare.


References

1.
Norman G. Medical education: past, present and future. Perspect Med Educ. 2012;1(1):6-14. DOI: 10.1007/s40037-012-0002-7
2.
Buja LM. Medical education today: All that glitters is not gold. BMC Med Educ. 2019;19(1):110. DOI: 10.1186/s12909-019-1535-9
3.
Flexner A. Medical education in the United States and Canada. From the Carnegie Foundation for the Advancement of Teaching, Bulletin Number Four, 1910. Bull World Health Organ. 2002;80(7):594-602.
4.
Roberts KH. Some Characteristics of One Type of High Reliability Organization. Organ Sci. 1990;1(2):160-176. DOI: 10.1287/orsc.1.2.160
5.
Rochlin G, La Porte T, Roberts K. The self-designing high-reliability organization: Aircraft carrier flight operations at sea. Nav War Coll Rev. 1998;51(3):97.
6.
McGaghie WC, Issenberg SB, Petrusa ER, Scalese RS. A critical review of simulation-based medical education research: 2003-2009. Med Educ. 2010;44(1):50-63. DOI: 10.1111/j.1365-2923.2009.03547.x
7.
Gaba DM. The future vision of simulation in health care. Qual Saf Health Care. 2004;13(Suppl 1):2-10. DOI: 10.1136/qhc.13.suppl_1.i2
8.
Maran NJ, Glavin RJ. Low- to high-fidelity simulation - A continuum of medical education? Med Educ. 2003;37 Suppl 1:22-28. DOI: 10.1046/j.1365-2923.37.s1.9.x
9.
Willaert WI, Aggarwal R, Van Herzeele I, Cheshire NJ, Vermassen FE. Recent advancements in medical simulation: Patient-specific virtual reality simulation. World J Surg. 2012;36(7):1703-1712. DOI: 10.1007/s00268-012-1489-0
10.
Morris MC, Conroy P. Development of a simulation-based sub-module in undergraduate medical education. Ir J Med Sci. 2020;189(1):389-394. DOI: 10.1007/s11845-019-02050-3
11.
Paskins Z, Peile E. Final year medical students’ views on simulation-based teaching: A comparison with the Best Evidence Medical Education Systematic Review. Med Teach. 2010;32(7):569-577. DOI: 10.3109/01421590903544710
12.
Subramanian A, Timberlake M, Mittakanti H, Lara M, Brandt ML. Novel educational approach for medical students: Improved retention rates using interactive medical software compared with traditional lecture-based format. J Surg Educ. 2012;69(2):253-256. DOI: 10.1016/j.jsurg.2012.05.013
13.
Khan K, Pattison T, Sherwood M. Simulation in medical education. Med Teach. 2011;33(1):1-3. DOI: 10.3109/0142159X.2010.519412
14.
Riaz S. How Simulation-Based Medical Education Can Be Started In Low Resource Settings. J Ayub Med Coll Abbottabad. 2019;31(4):636-637.
15.
Borggreve AS, Meijer JM, Schreuder HW, ten Cate O. Simulation-based trauma education for medical students: A review of literature. Med Teach. 2017;39(6):631-638. DOI: 10.1080/0142159X.2017.1303135
16.
Hogg G, Miller D. The effects of an enhanced simulation programme on medical students’ confidence responding to clinical deterioration. BMC Med Educ. 2016;16:161. DOI: 10.1186/s12909-016-0685-2
17.
Nitschmann C, Bartz D, Johnson NR. Gynecologic Simulation Training Increases Medical Student Confidence and Interest in Women’s Health. Teach Learn Med. 2014;26(2):160-163. DOI: 10.1080/10401334.2014.883984
18.
Harden RMG, Downie WW, Stevenson M, Wilson GM. Assessment of Clinical Competence using Objective Structured Examination. Br Med J. 1975;1(5955):447-451. DOI: 10.1136/bmj.1.5955.447
19.
Grünewald M, Klein E, Hapfelmeier A, Wuensch A, Berberat PO, Gartmeier M. Improving physicians’ surgical ward round competence through simulation-based training. Patient Educ Couns. 2020;103(5):971-977. DOI: 10.1016/j.pec.2019.11.029
20.
Nassif J, Sleiman AK, Nassar AH, Naamani S, Sharara-Chami R. Hybrid Simulation in Teaching Clinical Breast Examination to Medical Students. J Cancer Educ. 2019;34(1):194-200. DOI: 10.1007/s13187-017-1287-3
21.
Angarita FA, Price B, Castelo M, Tawil M, Ayala JC, Torregrossa L. Improving the competency of medical students in clinical breast examination through a standardized simulation and multimedia-based curriculum. Breast Cancer Res Treat. 2019;173(2):439-445. DOI: 10.1007/s10549-018-4993-6
22.
Alluri RK, Tsing P, Lee E, Napolitano J. A randomized controlled trial of high-fidelity simulation versus lecture-based education in preclinical medical students. Med Teach. 2016;38(4):404-409. DOI: 10.3109/0142159X.2015.1031734
23.
Zhang MY, Cheng X, Xu AD, Luo LP, Yang X. Clinical simulation training improves the clinical performance of Chinese medical students. Med Educ Online. 2015;20:28796. DOI: 10.3402/meo.v20.28796
24.
Solymos O, O’Kelly P, Walshe CM. Pilot study comparing simulation-based and didactic lecture-based critical care teaching for final-year medical students. BMC Anesthesiol. 2015;15(1):6-10. DOI: 10.1186/s12871-015-0109-6
25.
Fisher JM, Walker RW. A new age approach to an age old problem: Using simulation to teach geriatric medicine to medical students. Age Ageing. 2014;43(3):424-428. DOI: 10.1093/ageing/aft200
26.
DeWaay DJ, McEvoy MD, Kern DH, Alexander LA NP. Simulation curriculum can improve medical student assessment and management of acute coronary syndrome during a clinical practice exam. Am J Med Sci. 2014;347(6):452-456. DOI: 10.1097/MAJ.0b013e3182a562d7
27.
McCoy CE, Menchine M, Anderson C, Kollen R, Langdorf MI, Lotfipour S. Prospective randomized crossover study of simulation vs. didactics for teaching medical students the assessment and management of critically ill patients. J Emerg Med. 2011;40(4):448-455. DOI: 10.1016/j.jemermed.2010.02.026
28.
Bernardi S, Giudici F, Leone MF, Zuolo G, Furlotti S, Carretta R, Fabris B. A prospective study on the efficacy of patient simulation in heart and lung auscultation. BMC Med Educ. 2019;19(1):275. DOI: 10.1186/s12909-019-1708-6
29.
Giblett N, Rathore R, Carruthers D. Simulating the Surgical Patient Pathway for Undergraduates. J Surg Educ. 2017;74(2):271-276. DOI: 10.1016/j.jsurg.2016.10.003
30.
Sánchez-Ledesma MJ, Juanes JA, Sáncho C, Alonso-Sardón M, Gonçalves J. Acquisition of Competencies by Medical Students in Neurological Emergency Simulation Environments Using High Fidelity Patient Simulators. J Med Syst. 2016;40(6):139. DOI: 10.1007/s10916-016-0496-3
31.
Pereira D, Gomes P, Faria S, Cruz-Correia R, Coimbra M. Teaching cardiopulmonary auscultation in workshops using a virtual patient simulation technology - A pilot study. In: Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society. Piscataway (NJ): IEEE EMBS; 2016. p.3019-3022.
32.
Williams H, Yang L, Gale J, Paranehewa S, Joshi A, Westwood M, Weerackody R. Simulation of cardiac emergencies with real patients. Clin Teach. 2015;12(5):341-345. DOI: 10.1111/tct.12322
33.
Perlini S, Salinaro F, Santalucia P, Musca F. Simulation-guided cardiac auscultation improves medical students’ clinical skills: The Pavia pilot experience. Intern Emerg Med. 2014;9(2):165-172. DOI: 10.1007/s11739-012-0811-z
34.
Swamy M, Sawdon M, Chaytor A, Cox D, Barbaro-Brown J, McLachlan J. A study to investigate the effectiveness of SimMan® as an adjunct in teaching preclinical skills to medical students. BMC Med Educ. 2014;14:231. DOI: 10.1186/1472-6920-14-231
35.
Swamy M, Bloomfield TC, Thomas RH, Singh H, Searle RF. Role of SimMan in teaching clinical skills to preclinical medical students. BMC Med Educ. 2013;13(1):13-18. DOI: 10.1186/1472-6920-13-20
36.
Vattanavanit V, Kawla-Ied J, Bhurayanontachai R. High-fidelity medical simulation training improves medical students’ knowledge and confidence levels in septic shock resuscitation. Open Access Emerg Med. 2017;9:1-7. DOI: 10.2147/OAEM.S122525
37.
Kern DH, Mainous AG, Carey M, Beddingfield A. Simulation-based teaching to improve cardiovascular exam skills performance among third-year medical students. Teach Learn Med. 2011;23(1):15-20. DOI: 10.1080/10401334.2011.536753
38.
Mangione S, Nieman LZ. Cardiac auscultatory skills among internal medicine and family practice trainees: A comparison of diagnostic proficiency. JAMA. 1997;278(9):717-722. DOI: 10.1001/jama.1997.03550090041030
39.
Reilly BM. Physical examination in the care of medical inpatients: An observational study. Lancet. 2003;362(9390):1100-1105. DOI: 10.1016/S0140-6736(03)14464-9
40.
Beech DJ, Domer FR. Utility of the case-method approach for the integration of clinical and basic science in surgical education. J Cancer Educ. 2002;17(3):161-164. DOI: 10.1080/08858190209528825
41.
Hamstra SJ, Brydges R, Hatala R, Zendejas B, Cook DA. Reconsidering fidelity in simulation-based training. Acad Med. 2014;89(3):387-392. DOI: 10.1097/ACM.0000000000000130
42.
Massoth C, Röder H, Ohlenburg H, Hessler M, Zarbock A, Pöpping DM, Wenk M. High-fidelity is not superior to low-fidelity simulation but leads to overconfidence in medical students. BMC Med Educ. 2019;19(1):29. DOI: 10.1186/s12909-019-1464-7
43.
Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 Suppl):S63-S67. DOI: 10.1097/00001888-199009000-00045
44.
Epstein RM, Hundert EM. Defining and Assessing Professional Competence. JAMA. 2002;287(2):226-235. DOI: 10.1001/jama.287.2.226
45.
Adjedj J, Ducrocq G, Bouleti C, Reinhart L, Fabbro E, Elbez Y, Fischer Q, Tesniere A, Feldmann L, Varenne O. Medical student evaluation with a serious game compared to multiple choice questions assessment. JMIR Serious Games. 2017;5(2):e11. DOI: 10.2196/games.7033
46.
Fonteneau T, Billion E, Abdoul C, Le S, Hadchouel A, Drummond D. Simulation game versus multiple choice questionnaire to assess the clinical competence of medical students: Prospective sequential trial. J Med Internet Res. 2020;22(12):e23254. DOI: 10.2196/23254
47.
Ishak W, Nikravesh R, Lederer S, Perry R, Ogunyemi D, Bernstein C. Burnout in medical students: A systematic review. Clin Teach. 2013;10(4):242-245. DOI: 10.1111/tct.12014
48.
Dyrbye L, Shanafelt T. A narrative review on burnout experienced by medical students and residents. Med Educ. 2016;50(1):132-149. DOI: 10.1111/medu.12927
49.
Kemp S, Hu W, Bishop J, Forrest K, Hudson JN, Wilson I, Teodorczuk A, Rogers GD, Roberts C, Wearn A. Medical student wellbeing - A consensus statement from Australia and New Zealand. BMC Med Educ. 2019;19(1):69. DOI: 10.1186/s12909-019-1505-2
50.
Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gøtzsche PC, Ioannidis JP, Clarke M, Devereaux PJ, Kleijnen J, Moher D. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med. 2009;6(7):e1000100. DOI: 10.1371/journal.pmed.1000100
51.
PRISMA. PRISMA 2009 Flow Diagram (German). Available from: https://www.prisma-statement.org/Translations/Translations