gms | German Medical Science

GMS Journal for Medical Education

Gesellschaft für Medizinische Ausbildung (GMA)

ISSN 2366-5017

Formative key feature examinations as innovative teaching approach in dental education: A project report


  • corresponding author Tim Becker - University Medical Center Göttingen, Study Deanery of the Medical Faculty, Division of Medical Education, Göttingen, Germany
  • author Marc André Ackermann - University Medical Center Göttingen, Department of Oral and Maxillofacial Surgery, Göttingen, Germany
  • author Sabine Sennhenn-Kirchner - University Medical Center Göttingen, Department of Oral and Maxillofacial Surgery, Göttingen, Germany

GMS J Med Educ 2024;41(4):Doc39

doi: 10.3205/zma001694, urn:nbn:de:0183-zma0016942

This is the English version of the article.
The German version can be found at: http://www.egms.de/de/journals/zma/2024-41/zma001694.shtml

Received: December 22, 2023
Revised: May 2, 2024
Accepted: June 11, 2024
Published: September 16, 2024

© 2024 Becker et al.
This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 License. See license information at http://creativecommons.org/licenses/by/4.0/.


Abstract

Introduction: Clinical reasoning ability is one of the core competencies of physicians. It should already be trained during undergraduate medical education. At University Medical Center Göttingen (UMG), medical students can participate in formative key feature examinations in which they work on virtual patient cases in order to apply and deepen the procedural knowledge acquired in lectures and seminars.

Problem and objective: While this teaching format is already established in the medical curriculum at the UMG, it has not yet been implemented in the dental curriculum at the same institution. Therefore, the current project aimed to evaluate the feasibility of formative key feature examinations in dental education.

Methods: In 2022, new key feature cases focusing on dental-surgical teaching content were created. For pilot testing, the new cases were worked on by two cohorts of dental students via an online learning platform in February 2023. The students were also asked to complete an anonymous online questionnaire in order to evaluate the new teaching format.

Results: Overall, the formative key feature examinations were evaluated positively by the dental students, who also asked for further dental key feature cases. However, descriptive analyses of item characteristics as well as students’ comments in the questionnaire revealed some potential for improvement, so a few cases were partially revised afterwards.

Conclusion and outlook: This project shows that formative key feature examinations are feasible in dental education and that dental students can benefit from working on virtual case scenarios. Whether dental students’ clinical reasoning competence can be improved by completing formative key feature examinations is being investigated in an ongoing study at the UMG.

Keywords: dental education, dental licensing regulations, key feature cases, formative examination, test-enhanced learning, clinical reasoning, clinical decision-making, virtual patients, digital teaching, e-learning


Introduction

Theoretical background

One of the core competencies of physicians is the ability to arrive at correct diagnoses and treatment recommendations based on the results of appropriate history taking and diagnostic testing. Such complex cognitive processes are referred to as “clinical reasoning” and they should already be trained during undergraduate medical education [1], [2].

One effective teaching format to improve clinical reasoning competence is case-based learning in which medical students are faced with clinical problems using specific case examples [3]. However, case-based learning in small groups with real patients is resource-intensive and not easy to standardise [4]. Therefore, computer-assisted case-based learning using virtual patients is a suitable alternative since various digital case scenarios can be worked on by large groups of students in order to train clinical reasoning processes [5], [6].

Students’ performance in clinical reasoning can be assessed using so-called key feature examinations: in this test format, students are presented with clinical case scenarios and have to answer questions that focus on the critical steps (namely the key features) of these cases, such as diagnostic or therapeutic procedures [7], [8]. However, key feature examinations can be used not only for assessing but also for improving students’ clinical reasoning competence: this approach of test-enhanced learning is based on the so-called testing effect, which suggests that repeated retrieval of memorised content (e.g. by taking formative tests) can stimulate cognitive processes and improve long-term retention of the retrieved content [9], [10].

Formative key feature examinations at University Medical Center Göttingen

With this in mind, a specific teaching format was implemented at University Medical Center Göttingen (UMG) in the clinical phase of the medical curriculum in 2013: In weekly computer-assisted seminars, students can participate in formative key feature examinations in order to apply and deepen the procedural knowledge they gained in previous lectures and seminars. This specific teaching format has been accompanied by research, and several studies have shown that medical students’ clinical reasoning competence can be improved by completing formative key feature examinations [4], [11], [12], [13].

Objectives of the present project

There is a need to train clinical reasoning processes not only in medical education but also in dental education [14]. However, while the teaching format of formative key feature examinations is already established in the medical curriculum at the UMG, it has not yet been implemented in the dental curriculum at the same institution. Therefore, the present project aimed to create new dental key feature cases and to evaluate their feasibility in dental education at the UMG. Another objective was to define a standardised procedure for creating further dental key feature cases. Overall, dental students should have the opportunity to benefit from formative key feature examinations in the same way as medical students do at the UMG.


Methods

Creating dental key feature cases

In 2022, six initial key feature cases focusing on dental-surgical teaching content were created in interdisciplinary cooperation between the Division of Medical Education and the Department of Oral and Maxillofacial Surgery at the UMG. The step-by-step procedure for creating the new cases was based on the practical guide published by Kopp et al. [15] and was complemented by repeated reviews and revisions by experts from the two institutions mentioned above (see figure 1 [Fig. 1]). This iterative process of creating cases was intended to ensure the highest possible quality of the dental key feature cases even before piloting. After finalisation, the new cases were transferred to the online learning platform ILIAS, which is used to provide digital teaching resources at the UMG.

Piloting and evaluating the new cases

For pilot testing, three of the six new key feature cases were worked on by two cohorts of fourth-year undergraduate dental students in computer-based seminars in February 2023. Additionally, the students were asked to evaluate the new teaching format by completing an online questionnaire using the evaluation software EvaSys. The anonymous questionnaire contained closed items (rated on six-point scales, see figure 2 [Fig. 2]) as well as open-ended questions. Data collection and processing were approved by the ethics committee of the UMG (application no. 15/1/23). The students provided written consent to participate in the project.

After the students had completed working on the cases, data were exported from ILIAS and analysed anonymously. Descriptive analyses included students’ login times (i.e. the time required for completing the cases) as well as the percentages of correct and incorrect answers to the individual key feature questions. Additionally, analyses of item difficulty and item discrimination indices were performed [16] in order to assess the quality of the individual questions and the entire key feature cases and, if necessary, to make revisions. Data from the evaluation questionnaire were exported from EvaSys and analysed descriptively as well.
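As a brief illustration of these classical item statistics, the following minimal sketch shows how item difficulty (mean item score relative to the maximum attainable score) and item discrimination (corrected item-total correlation) can be computed from a students-by-questions score matrix. It is not the analysis pipeline used in the project; the function name and the example data are purely illustrative.

import numpy as np

def item_statistics(scores, max_points=1.0):
    # scores: students x questions matrix of achieved points
    scores = np.asarray(scores, dtype=float)
    n_students, n_items = scores.shape

    # Item difficulty: mean score per question relative to the maximum attainable score
    difficulty = scores.mean(axis=0) / max_points

    # Item discrimination: correlation of each question with the total score
    # of all remaining questions (corrected item-total correlation)
    total = scores.sum(axis=1)
    discrimination = np.empty(n_items)
    for i in range(n_items):
        rest = total - scores[:, i]
        discrimination[i] = np.corrcoef(scores[:, i], rest)[0, 1]

    return difficulty, discrimination

# Illustrative example: 5 students, 3 dichotomously scored key feature questions
example = [[1, 1, 0],
           [1, 0, 0],
           [1, 1, 1],
           [0, 1, 0],
           [1, 1, 1]]
difficulty, discrimination = item_statistics(example)
print(difficulty.round(2), discrimination.round(2))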


Results

Results of the key feature cases

The dental key feature cases were completed by 71 of the 79 eligible students (=89.9%). The average time required for completing the three cases was 15:56 (±5:21) minutes, and the students achieved a mean of 11.8 (±2.7) out of 16 points. Analyses of item characteristics showed that most of the key feature questions had adequate difficulty and discrimination indices. For two questions, however, the item characteristics as well as the wide range of students’ answers indicated that these questions had not been formulated precisely enough and therefore needed to be revised. As an example, table 1 [Tab. 1] shows the item difficulty and discrimination indices for one of the three key feature cases (which is attached to this project report as attachment 1 [Attach. 1]).

Results of the evaluation questionnaire

The evaluation questionnaire was completed anonymously by 65 of the 71 students (=91.5%). Their ratings on the closed items showed, for example, that providing the key feature cases via the online platform worked well technically, that working on the cases helped them engage with dental teaching content and clinical decisions, and that they would welcome an expansion of the new teaching format (see figure 2 [Fig. 2]). Comments on the open-ended questions also reflected the students’ overall positive evaluation of the dental key feature cases as well as their wish for the teaching format to be expanded. In addition to this general feedback, the comments contained some specific suggestions for revising the key feature cases, such as adding more synonyms of correct answers (e.g. alternative spellings or common abbreviations) to the drop-down lists of the long menu format [15] used in the cases.
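To make this suggestion more concrete: in a long menu question, a typed answer counts as correct if it matches one of the stored correct entries, so accepting alternative spellings or common abbreviations amounts to extending that list of entries. The following sketch is purely illustrative of such synonym-aware answer checking; it does not reflect how ILIAS implements the long menu format, and the example terms are hypothetical placeholders.

# Hypothetical illustration of synonym-aware answer checking; not the ILIAS implementation.
ACCEPTED_ANSWERS = {
    "orthopantomogram",        # full term
    "panoramic radiograph",    # alternative term
    "opg", "opt",              # common abbreviations
}

def normalise(answer: str) -> str:
    # reduce the answer to lower case and single spaces before comparison
    return " ".join(answer.lower().split())

def is_correct(answer: str) -> bool:
    return normalise(answer) in ACCEPTED_ANSWERS

print(is_correct("OPG"))                    # True
print(is_correct("Panoramic  radiograph"))  # True
print(is_correct("periapical radiograph"))  # False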


Discussion and outlook

The present project aimed to evaluate the feasibility of formative key feature examinations in dental education, as this teaching format had not yet been implemented in the dental curriculum at the UMG (although it has been established for years in the medical curriculum at the same institution). Internationally, too, only a few faculties seem to use key feature cases in dental education: we are aware of only two studies in which key feature examinations were used to assess the clinical reasoning competence of dental students [17], [18]. In one of these studies, Owlia et al. found that the clinical reasoning skills of 11th-semester students were at a low level and concluded that clinical reasoning processes need to be trained already during undergraduate dental education [18]. Thus, our concept of using key feature examinations not only for assessing but also for improving students’ clinical reasoning competence (in the sense of test-enhanced learning) represents an innovative teaching approach in dental education. Moreover, the new teaching format complies with the new German dental licensing regulations, which recommend promoting students’ problem-based learning using specific case scenarios.

Creating the new dental key feature cases was time-consuming. On the one hand, this was due to the complex long menu format [15] and the detailed feedback texts (which were displayed automatically to the students after they had answered a key feature question, see attachment 1 [Attach. 1]). On the other hand, it was due to the step-by-step procedure of creating the cases with iterative reviews and revisions by various experts (see figure 1 [Fig. 1]). However, it can be assumed that a learning curve will make the creation of further dental key feature cases less time-consuming. Moreover, the iterative process of creating cases ensures a high quality of new cases even before piloting (demonstrated by the mostly adequate item difficulty and discrimination indices in our project), so that revisions of the new cases require less time as well.

During pilot testing, the new cases were evaluated positively by the students overall. Comments on the open-ended questions showed that students asked for further dental key feature cases, as this teaching format offers the possibility to apply and deepen their procedural knowledge within specific clinical contexts. Only one closed item of the questionnaire, evaluating the acquisition of new knowledge by working on the key feature cases, was rated somewhat lower than the other items. However, this rating fits well with the intention of the teaching format: since the key feature cases focus on teaching content from previous lectures and seminars, they aim at applying existing knowledge rather than acquiring new knowledge. Besides general feedback on satisfaction, the students also made some specific suggestions for revising the cases. Thus, the results of the evaluation questionnaire (as well as the descriptive analyses of item characteristics) played an important role in validating and optimising the new dental key feature cases [15], [19].

In conclusion, our project shows that formative key feature examinations are feasible in dental education and that dental students can benefit from working on virtual case scenarios. Whether dental students’ clinical reasoning competence can be improved by completing formative key feature examinations is currently being investigated in an ongoing study at the UMG.


Notes

Authorship

Tim Becker and Marc André Ackermann share the first authorship.



Acknowledgements

The authors acknowledge support from the Open Access Publication Funds of Göttingen University. They would also like to thank all dental students who participated in this project.


Competing interests

The authors declare that they have no competing interests.


References

1. Eva KW. What every teacher needs to know about clinical reasoning. Med Educ. 2005;39(1):98-106. DOI: 10.1111/j.1365-2929.2004.01972.x
2. Norman G. Research in clinical reasoning: past history and current trends. Med Educ. 2005;39(4):418-427. DOI: 10.1111/j.1365-2929.2005.02127.x
3. Kassirer JP. Teaching clinical reasoning: case-based and coached. Acad Med. 2010;85(7):1118-1124. DOI: 10.1097/acm.0b013e3181d5dd0d
4. Raupach T, Andresen J, Meyer K, Strobel L, Koziolek M, Jung W, Brown J, Anders S. Test-enhanced learning of clinical reasoning: a crossover randomised trial. Med Educ. 2016;50(7):711-720. DOI: 10.1111/medu.13069
5. Berman NB, Durning SJ, Fischer MR, Huwendiek S, Triola MM. The role for virtual patients in the future of medical education. Acad Med. 2016;91(9):1217-1222. DOI: 10.1097/ACM.0000000000001146
6. Hege I, Kononowicz A, Berman NB, Lenzer B, Kiesewetter J. Advancing clinical reasoning in virtual patients – development and application of a conceptual framework. GMS J Med Educ. 2018;35(1):Doc12. DOI: 10.3205/zma001159
7. Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills. Acad Med. 1995;70(3):194-201. DOI: 10.1097/00001888-199503000-00009
8. Hrynchak P, Takahashi SG, Nayer M. Key-feature questions for assessment of clinical reasoning: a literature review. Med Educ. 2014;48(9):870-883. DOI: 10.1111/medu.12509
9. Roediger HL, Karpicke JD. The power of testing memory: basic research and implications for educational practice. Perspect Psychol Sci. 2006;1(3):181-210. DOI: 10.1111/j.1745-6916.2006.00012.x
10. Larsen DP, Butler AC, Roediger HL. Test-enhanced learning in medical education. Med Educ. 2008;42(10):959-966. DOI: 10.1111/j.1365-2923.2008.03124.x
11. Ludwig S, Schuelper N, Brown J, Anders S, Raupach T. How can we teach medical students to choose wisely? A randomised controlled cross-over study of video- versus text-based case scenarios. BMC Med. 2018;16(1):107. DOI: 10.1186/s12916-018-1090-y
12. Schuelper N, Ludwig S, Anders S, Raupach T. The impact of medical students’ individual teaching format choice on the learning outcome related to clinical reasoning. JMIR Med Educ. 2019;5(2):e13386. DOI: 10.2196/13386
13. Berens M, Becker T, Anders S, Sam AH, Raupach T. Effects of elaboration and instructor feedback on retention of clinical reasoning competence among undergraduate medical students: a randomized crossover trial. JAMA Netw Open. 2022;5(12):e2245491. DOI: 10.1001/jamanetworkopen.2022.45491
14. Khatami S, Macentee MI. Evolution of clinical reasoning in dental education. J Dent Educ. 2011;75(3):321-328.
15. Kopp V, Möltner A, Fischer MR. Key-Feature-Probleme zum Prüfen von prozeduralem Wissen: ein Praxisleitfaden [Key feature problems for assessing procedural knowledge: a practical guide]. GMS Z Med Ausbild. 2006;23(3):Doc50. Available from: https://www.egms.de/static/de/journals/zma/2006-23/zma000269.shtml
16. Möltner A, Schellberg D, Jünger J. Grundlegende quantitative Analysen medizinischer Prüfungen [Basic quantitative analyses of medical examinations]. GMS Z Med Ausbild. 2006;23(3):Doc53. Available from: https://www.egms.de/static/de/journals/zma/2006-23/zma000272.shtml
17. Sharma P, Fulzele P, Chaudhary M, Gawande M, Patil S, Hande A. Introduction of key feature problem based questions in assessment of dental students. Int J Cur Res Rev. 2020;12(14):56-61. DOI: 10.31782/IJCRR.2020.121412
18. Owlia F, Keshmiri F, Kazemipoor M, Rashidi Maybodi F. Assessment of clinical reasoning and diagnostic thinking among dental students. Int J Dent. 2022:1085326. DOI: 10.1155/2022/1085326
19. Farmer EA, Page G. A practical guide to assessing clinical decision-making skills using the key features approach. Med Educ. 2005;39(12):1188-1194. DOI: 10.1111/j.1365-2929.2005.02339.x