gms | German Medical Science

GMS Journal for Medical Education

Gesellschaft für Medizinische Ausbildung (GMA)

ISSN 2366-5017

Does prior knowledge affect interaction dynamics and learning achievement in digital problem-based learning? A pilot study


  • corresponding author Martin Möser - Goethe University Frankfurt, Department of Operative Dentistry, Frankfurt/Main, Germany
  • author Rico Hermkes - Goethe University Frankfurt, Department of Business Ethics and Business Education, Frankfurt/Main, Germany
  • author Natalie Filmann - Goethe University Frankfurt, Department of Biostatistics and Mathematical Modelling, Frankfurt/Main, Germany
  • author Seon-Yee Harsch - Goethe University Frankfurt, Department of Operative Dentistry, Frankfurt/Main, Germany
  • author Stefan Rüttermann - Goethe University Frankfurt, Department of Operative Dentistry, Frankfurt/Main, Germany
  • author Susanne Gerhard-Szép - Goethe University Frankfurt, Department of Operative Dentistry, Frankfurt/Main, Germany

GMS J Med Educ 2023;40(6):Doc69

doi: 10.3205/zma001651, urn:nbn:de:0183-zma0016518

This is the English version of the article.
The German version can be found at: http://www.egms.de/de/journals/zma/2023-40/zma001651.shtml

Received: January 18, 2023
Revised: June 1, 2023
Accepted: August 8, 2023
Published: November 15, 2023

© 2023 Möser et al.
This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 License. See license information at http://creativecommons.org/licenses/by/4.0/.


Abstract

Objective: Previous research on problem-based learning (PBL) has shown that videotaped observations provide meaningful insights into cognitive processes in tutorial groups. The effect of the amount of prior knowledge on learning achievement has not yet been investigated in medical education, although both are key factors in PBL success. We therefore analysed videos of digital problem-based learning (dPBL) sessions, focusing on knowledge acquisition and interaction dynamics in groups with different levels of prior knowledge, in order to reveal any distinctions.

Methods: In this pilot study, 60 dental students were divided into twelve subgroups with less or more prior knowledge, determined by a pre-semester multiple-choice test (MCQ). The groups worked on videotaped dPBL cases, which were examined with regard to group interactions and tutor effectiveness. Learning achievement was assessed through a post-semester MCQ, an oral exam and a practical exam.

Results: The video analysis showed that dPBL groups with less prior knowledge produced significantly more group-interaction and tutor-effectiveness utterances, whereas the percentage of session time in which utterances occurred was similar in both groups. In the MCQ, the students with less prior knowledge gained roughly four times as many points as those with more prior knowledge, but no significant difference was found in the results of the oral exam and the practical exam.

Conclusions: The interaction dynamics in dPBL depend on the group’s amount of prior knowledge. Groups including participants with less prior knowledge in particular seemed to benefit from dPBL in comparison to groups with more prior knowledge. The dPBL groups acquired knowledge in different ways during the courses, but all students ultimately arrived at a similar level of knowledge.

Keywords: problem-based learning, PBL, video-study, digital, interaction, prior knowledge, learning achievement


Introduction

More than 50 years after its first implementation, problem-based learning (PBL) continues to be endorsed and has spread throughout medical education, enhancing students’ abilities and preparedness for real-life challenges [1].

Research in the last decade has identified several benefits of PBL: it is a more enjoyable experience that can foster students’ knowledge construction processes, and participants have been described as relishing its self-directed nature [2]. Students who frequently take part in PBL sessions achieve notably better results in knowledge assessments and rate it as a more satisfying way to study than teacher-centred lectures [3]. Furthermore, recent developments such as interprofessional PBL have also shown a high level of acceptance among participants [4].

Some educators even consider PBL a universal remedy for educational shortcomings, while others vehemently reject its implementation and question the impact and value of PBL on learning because of its unconventional philosophy and instructional practices [5], [6]. Indeed, there is also evidence that traditional courses tend to lead to greater acquisition of scientific knowledge than PBL [7].

Several factors are now known to affect learning outcomes in PBL: the domain in which it is used, the age of the participants and the subject matter are some of the parameters that influence PBL results [8]. The effectiveness of PBL and the participants’ learning depend largely on the tutors’ pedagogical abilities, academic background, training concept and maintenance of the quality of discussion [9], [10], [11], [12]. However, there is still no final conclusion on the best way to optimise PBL teaching [13].

During the COVID-19 pandemic, PBL research yielded new insights into synchronous online learning, and many faculties produced meaningful results by digitalising their curricula on e-learning platforms [14], [15], [16], [17], [18], [19], [20], [21], [22]. Furthermore, it has been described that direct or videotaped observations, as well as corpus analysis of recordings, provide unadulterated information about cognitive processes in small groups [23], [24]. Previous analyses of group interactions and tutor effectiveness in PBL have shown promising results [24], [25], [26], [27], [28], [29], [30], [31]; however, the influence of the amount of prior knowledge on learning achievement has never been investigated in medical education, although both are considered key factors of PBL success [32]. Research in other disciplines, e.g. social science and mathematics, has already investigated this relationship and clarified its significance [33], [34].

Given this state of research, we set out to analyse videos of digital PBL (dPBL) sessions, concentrating on group interactions, tutor effectiveness and knowledge acquisition in groups with different amounts of prior knowledge. To identify distinctions between the two groups and to obtain a multi-perspective view, videotaped dPBL sessions, multiple-choice tests (MCQ), a structured oral exam (SOE) and an objective structured practical exam (OSPE) were analysed.

Research questions:

  • Are there significant differences regarding the group interactions between the two groups?
  • Are there significant differences regarding the tutor effectiveness between the two groups?
  • Are there significant differences regarding the knowledge acquisition of the two groups?

Materials and method

1. Context and participants

This study was conducted at the dental school of the Goethe University in Frankfurt, Germany.

The students of the first clinical semester (n=60) were divided into two groups according to their prior knowledge, which was determined, among other things, by a pre-semester multiple-choice test (MCQ). The division into the two groups was based on the number of points achieved in the pre-semester MCQ, the preclinical examination grade and the general qualification grade for university entrance in Germany (Abitur). Gender was also taken into account so that women (n=37) and men (n=23) were evenly distributed across the final twelve dPBL subgroups. Students with less prior knowledge were assigned to group A (n=30) and students with more prior knowledge to group B (n=30). Both groups were further divided into subgroups (A 1-6 and B 1-6) using the parameters described above to create comparable subgroups, each with five participants and a tutor.

During the semester, the twelve groups worked on five dPBL cases. Based on the seven-step PBL model established at the University of Maastricht [35], the first five steps were processed in the first digital session, the sixth step in a self-study period and the seventh step in the second digital session. As Barrows [36] recommended, an eighth step was added in the second digital session to evaluate the dPBL process.

An established expert (Master of Medical Education) trained the tutors in how to deliver dPBL in a so-called non-facilitative and facilitative style [37]. The tutors were instructed first to guide the dPBL sessions in a facilitative style and then, as the students gained experience over their dPBL cases, to guide them in a non-facilitative manner. All tutors were dentists, and each of them supervised one group of students with less prior knowledge and one group of students with more prior knowledge.

Neither the students nor the tutors knew about the assessment of prior knowledge or the assignment to group A or B (double-blind setting). The sessions were conducted on Vydeo® (Vydeo, Berlin, Germany), an online platform on which the students and the tutor could see each other via video camera and work on a shared document. At the end of the semester, the students were required to pass a post-semester MCQ, a graded SOE and an OSPE.

2. Video material and coding scheme

2.1. Setting

For the quantitative and qualitative analysis of the videotaped dPBL sessions, the software Interact® (Version 18, Mangold International GmbH, Arnstorf, Germany) was used. After completing a two-week rater training, a second rater independently coded seven hours of randomly selected material (about 20% of the total video material [26]). The inter-rater reliability according to Cohen’s kappa was 0.81; the first rater therefore analysed all subsequent videos.
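
For illustration only, such an agreement check could be reproduced along the following lines. This is a minimal Python sketch, not the Interact® reliability tool used in the study; it assumes the two raters’ category labels have been exported as parallel lists with one label per coded segment.

# Minimal sketch of an inter-rater agreement check (assumed export format:
# parallel lists of category labels, one per coded segment).
from sklearn.metrics import cohen_kappa_score

rater_1 = ["cumulative_reasoning", "exploratory_questioning", "procedural", "off_task"]
rater_2 = ["cumulative_reasoning", "exploratory_questioning", "procedural", "procedural"]

kappa = cohen_kappa_score(rater_1, rater_2)
print(f"Cohen's kappa: {kappa:.2f}")  # values around 0.8 are commonly read as very good agreement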

2.2. Analysis variables and coding scheme

Based on Visschers-Pleijers’ coding scheme [26], we analysed three different types of group interaction: learning-orientated interactions (i.e. exploratory questioning, cumulative reasoning and handling conflicts about knowledge), procedural interactions and irrelevant/off-task interactions (see table 1 [Tab. 1]). Irrelevant/off-task interactions contained the subtype “period of silence”, which was also measured and analysed.

Furthermore, we examined tutor effectiveness using a coding scheme based on the evaluation sheet developed by Dolmans [38] as video analysis criteria (see table 2 [Tab. 2]). To this end, four approaches by which the tutors stimulated the students’ learning were observed: constructive/active learning, self-directed learning, contextual learning and collaborative learning. The tutors’ intra-personal behaviour was also examined.

As soon as a participant (student or tutor) made an utterance, ranging from one word to several sentences, it was classified according to the given criteria [26], [38]. Every single utterance of the students and tutors of both groups was analysed in terms of its frequency of occurrence and as a percentage of the total session time. Each utterance was coded from the moment it began and its length was determined (see figure 1 [Fig. 1]). In total, 12,766 student utterances and 1,476 tutor utterances were examined.
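
To make the two measures concrete, the following Python sketch shows how coded utterances could be aggregated into frequency of occurrence and percentage of total session time per category. The column names (category, start_s, end_s), the dummy values and the pandas-based workflow are assumptions for illustration and do not reflect the actual Interact® export.

# Sketch: aggregating coded utterances into frequency of occurrence and
# percentage of total session time per category (column names and data assumed).
import pandas as pd

utterances = pd.DataFrame({
    "category": ["cumulative_reasoning", "open_question", "cumulative_reasoning", "procedural"],
    "start_s":  [12.0, 45.5, 60.0, 130.0],   # utterance onset in seconds
    "end_s":    [40.0, 55.0, 120.0, 138.0],  # utterance offset in seconds
})
session_length_s = 90 * 60  # assumed session length of 90 minutes

utterances["duration_s"] = utterances["end_s"] - utterances["start_s"]
summary = utterances.groupby("category").agg(
    frequency=("category", "size"),
    time_pct=("duration_s", lambda d: 100 * d.sum() / session_length_s),
)
print(summary)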

3. Multiple-choice test

The pre- and post-semester MCQs each contained 50 questions dealing with dental materials and instruments, dental treatment instructions and diagnoses. The MCQ content was also covered in the dPBL sessions. The tests were based on a validated template, a previous analysis of which showed a Cronbach’s α of 0.63 (pre-test) and 0.67 (post-test) [37].
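
For reference, Cronbach’s α for such an item battery can be computed from the students-by-items score matrix as in the short Python sketch below; the random dummy scores and the helper function cronbach_alpha are assumptions for illustration only.

# Sketch: Cronbach's alpha for a students x items matrix of MCQ scores (0/1 per item).
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: 2D array with rows = students and columns = test items."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
demo_scores = rng.integers(0, 2, size=(60, 50))  # 60 students, 50 items (dummy data)
print(f"alpha = {cronbach_alpha(demo_scores):.2f}")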

4. Statistical analysis

The pre- and post-semester MCQ, SOE and OSPE data were analysed with the software BiAs® (Version 11.12, Frankfurt, Germany), and significant differences (p<0.05) were identified using the Wilcoxon-Mann-Whitney U test.
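
The same two-sided comparison can be reproduced with open-source tools; the sketch below uses SciPy’s Mann-Whitney U test on hypothetical point scores and is not the BiAs® implementation used in the study.

# Sketch: two-sided Wilcoxon-Mann-Whitney U test on hypothetical MCQ point scores.
from scipy.stats import mannwhitneyu

group_a_points = [21, 24, 26, 28, 30, 31, 33]  # dummy data
group_b_points = [30, 32, 34, 35, 36, 38, 40]  # dummy data

u_stat, p_value = mannwhitneyu(group_a_points, group_b_points, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")  # p < 0.05 would be read as significant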

The video rater’s data were examined using mixed-effects models, namely linear mixed models and generalised linear mixed models [39]. Cohen’s kappa was checked with an Interact® software tool developed specifically for analysing inter-rater reliability.
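
As an illustration of the modelling approach, a linear mixed model with a random intercept per dPBL subgroup could be fitted as sketched below using statsmodels; the variable names (utterance_count, group, subgroup) and the data are assumptions, and the generalised variant would substitute an appropriate GLMM for count or proportion outcomes.

# Sketch: linear mixed model with a random intercept per dPBL subgroup
# (variable names and values are assumed for illustration).
import pandas as pd
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "utterance_count": [364, 380, 410, 395, 230, 245, 210, 225],
    "group":           ["A", "A", "A", "A", "B", "B", "B", "B"],   # prior-knowledge group
    "subgroup":        ["A1", "A1", "A2", "A2", "B1", "B1", "B2", "B2"],
})

model = smf.mixedlm("utterance_count ~ group", data=data, groups=data["subgroup"])
result = model.fit()
print(result.summary())  # fixed effect of group plus subgroup-level random intercept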

A rough sample size estimation was derived by examining several factors. Firstly, the amount of PBL time analysed in previous research and the number of PBL sessions conducted were used as reference points [25], [26], [27], [29], [40], [41], [42], [43], [44], [45], [46]. Secondly, the units of analysis, such as individual videos or segments within videos, were considered. By averaging these aspects, we concluded that at least 15 videos were needed to obtain statistically meaningful information for this pilot study.

5. Application for ethical approval

According to the chairman of the ethics committee of the Goethe University in Frankfurt, no ethical approval was required because, with regard to the video material, being recorded was voluntary. The participants knew that they were being recorded on video and could approve or decline the recording before each session started. There were no disadvantages for participants who did not agree to be videotaped. All data protection requirements were met. Participation in the dPBL sessions and all exams was mandatory to pass the semester.


Results

1. Analysis of the video data

As the recording of the dPBL sessions was optional, 34 out of 120 PBL sessions were recorded and analysed (dropout rate of 70.9%). The 34 videos comprised 15 videos from group A (less prior knowledge) and 19 videos from group B (more prior knowledge). In both groups, first and second dPBL sessions were evenly represented.

1.1. Learning-orientated interactions analysis

Both groups spent most of their time on cumulative reasoning (group A: 67.3%; group B: 65.7%), while procedural interactions took the least time in both groups (group A: 2.5%; group B: 1.9%). The category with the most utterances was cumulative reasoning (group A: n=364.9; group B: n=229.6), while the rarest utterances were procedural (group A: n=18.7; group B: n=5.1).

With regard to the frequency of occurrence of group interactions, group A produced significantly more utterances than group B in handling conflicts about knowledge, open questions, other questions, statements, disagreements and procedural interactions (all p<0.05). There was no significant difference between the groups in the percentage of total session time spent on these types of interaction (see table 1 [Tab. 1]).

1.2. Tutor effectiveness analysis

In both groups, tutor utterances focussed mostly on constructive/active learning (group A: 59.3%; group B: 58.3%), while the tutors’ intra-personal behaviour took up the least amount of time (group A: 5%; group B: 7.6%). Constructive/active learning was the category with the most tutor utterances (group A: n=39; group B: n=30.2), while the fewest occurred in collaborative learning (group A: n=4.3; group B: n=2.5) (see table 2 [Tab. 2]).

2. Knowledge acquisition

Group A scored significantly fewer points in the MCQ pre-test but had significantly more points in the MCQ post-test, both compared to group B (see table 3 [Tab. 3]).

Group A showed a significant increase in knowledge of 13.7 points and group B of 3.7 points. On average, all students significantly increased their knowledge by 8.7 points (see table 4 [Tab. 4]).

There were no significant differences between both groups in structured oral exam or objective structured practical exam grades (see table 5 [Tab. 5]).


Discussion

The scientific relevance of this study lies in exploring whether there are significant differences in group interactions, tutor effectiveness and knowledge acquisition between two groups with different levels of prior knowledge. By investigating these aspects, it contributes to understanding the dynamics of group interactions, the effectiveness of tutors and the impact of prior knowledge on knowledge acquisition. These findings provide valuable insights for educational practitioners and researchers, informing the design and implementation of instructional strategies and interventions to enhance group interactions, optimise tutor effectiveness and promote effective knowledge acquisition in educational settings. Furthermore, this research provides new insights into the influence of prior knowledge on dPBL functioning and gives an example of how dPBL can still be used and analysed during pandemic situations.

Although this study is designated a pilot study, we examined 34 hours and 26 minutes of video material comprising 14,366 utterances; this is substantially more material than comparable studies analysing group interactions or tutor effectiveness in PBL [24], [25], [26], [27], [28], [29], [30], [31], [40], [41], [42], [43], [44], [45], [46], [47], [48], [49], [50], [51], [52], [53]. Nevertheless, we have scheduled a follow-up study with an exact sample size estimation to verify our findings and to produce further results.

1. Group interaction

Students with less prior knowledge (group A) produced significantly more utterances in several types of interaction while spending nearly the same amount of time on these interactions as students with more prior knowledge (group B), because group A’s utterances were mostly shorter.

A possible explanation is that group A may not have had the prior knowledge needed to discuss, argue, ask or explain in depth with each other [54], [55]. Additionally, it is possible that a PBL task is more cognitively activating for one group of learners than for another, depending on the group’s level of prior knowledge. The concept of cognitive activation stems from constructivist learning theory; it refers to the adaptivity of different problems and tasks for different learners and involves evoking cognitive conflicts to initiate learning processes [56], [57]. It might be beneficial to measure how learners are cognitively activated by the tasks in subsequent research. Cognitive activation can be assumed to affect subsequent group discussions, whereby groups with less activating tasks should be less engaged in the discussion.

Accordingly, group A showed a higher rate of conversational exchange and shared information more often, which may have supported the elaboration of their prior knowledge [49].

Furthermore, we cannot confirm the results of Visschers-Pleijers et al. [26] and De Grave et al. [58], who argued that if pre-set working rules and role division are retained, off-task interactions would rarely be observed. Our observation suggests that periods of silence in dPBL were mostly used by students as an “individual thinking period” before the group consensus was reached, an idea that Gukas et al. [47] also considered. In further research, it would be useful to treat “silence” as a separate category of interaction rather than as part of the off-task interactions, because there is evidence that groups scored highly on indices of learning even when they kept silent for some time and that the periods of silence did not indicate that the students were not learning effectively [44], [47].

2. Tutor effectiveness

The only difference concerning tutor effectiveness was that, in terms of frequency of occurrence, the tutors in group A stimulated the students significantly more often to understand the underlying mechanisms/theories. In relation to the percentage of the total session time, there was no difference between the groups on this point. These findings show that the tutors spent less time per utterance stimulating students with less prior knowledge to understand the underlying mechanisms/theories. A possible reason is that the students with more prior knowledge may have demanded more detailed explanations of the underlying mechanisms, while group A may have been satisfied with less profound statements.

Overall, the tutors spent more than half of their speaking time stimulating active learning, but only approximately 10% stimulating self-directed learning. Previous research has suggested that self-directed learning is key to medical skill development in undergraduate curricula [59]. If tutors generally spend little time stimulating self-directed learning, it may therefore be useful to examine this effect in further research in order to optimise the teaching and training of tutors.

3. Knowledge acquisition

In relation to their knowledge acquisition, students with more prior knowledge attained 3.7 points more in the MCQ post-test than in the MCQ pre-test, while students with less prior knowledge achieved 13.7 points more on average. It is remarkable that group A increased their knowledge during the dPBL sessions almost four times as much as group B. However, group B started with a relatively high result in the pre-test, so it was more difficult for them to improve their achievement in the post-test (ceiling effect). In addition, group A scored significantly fewer points in the pre-test and significantly more points in the post-test than group B. Several factors could explain group A’s relatively high level of knowledge acquisition compared to group B. Firstly, group A produced on average about 180 utterances more per dPBL session than group B (group A: 489.3; group B: 306.7); this suggests that there may have been more conversation and a greater flow of information within group A, which would consequently support their knowledge acquisition. Secondly, group A had significantly higher frequencies of occurrence of handling conflicts about knowledge. Engaging in this type of interaction is valuable because PBL initiates learning by provoking cognitive conflicts arising from controversies between the students’ knowledge and the problem they are dealing with [60]. Thirdly, as participants’ prior knowledge is activated, they find it easier to identify gaps in their knowledge to work on (activation–elaboration hypothesis); group B’s knowledge acquisition may have been hindered because they did not have extensive knowledge gaps to elaborate on [61]. Lastly, group A may have prepared themselves better for the dPBL cases, knowing that they had performed worse in the pre-semester MCQ and lacked the knowledge to deal with the cases, instead of trusting in solid prior knowledge as group B may have done. However, there were no significant differences between the two groups in the SOE or OSPE grades at the end of the semester. These findings suggest that both groups ended the semester with a similar level of knowledge but acquired it in different ways during the dPBL cases. In addition, dPBL may stimulate the learning process most for those participants with less prior knowledge, which may provide a hint towards answering Dolmans’ question of “under which conditions is PBL effective and for what kinds of students?” [7]. To answer this question more precisely, further research should also consider the ceiling effect [62].


Limitations

Several factors limit the generalisability of this study. Firstly, previous research has shown that participants being recorded during PBL are not as spontaneous as they would be naturally; being videotaped can thus influence participants’ behaviour [29]. Furthermore, the participants’ knowledge acquisition may not be entirely attributable to dPBL, as the students also had other lectures and practical training sessions during the semester that imparted knowledge. Another point affecting our results is that the participants gained experience during their dPBL cases, and this may have changed their group interaction through increasing expertise or through the change in tutoring style (from facilitative to non-facilitative [37]). However, our findings reflect only the group interaction as a whole and not its trends over time.


Conclusion

Interaction dynamics in dPBL depend on the group’s amount of prior knowledge. In particular, groups including participants with less prior knowledge seem to benefit from dPBL by tending towards a fast information flow with shorter but more frequent utterances compared to groups with more prior knowledge. The groups acquired knowledge in different ways during the courses; however, no significant differences were found between the students in the structured oral exam or the objective structured practical exam in relation to their prior knowledge.


Competing interests

The authors declare that they have no competing interests.


References

1. Moallem M. Section 1: Understanding PBL: Historical and Theoretical Foundations. In: Moallem M, Hung W, Dabbagh N, editors. The Wiley handbook of problem-based learning. Hoboken, NJ: John Wiley & Sons; 2019. p.1-3. DOI: 10.1002/9781119173243
2. Anderson V, Reid K. Students’ perception of a problem-based learning scenario in dental nurse education. Eur J Dent Educ. 2012;16(4):218-223. DOI: 10.1111/j.1600-0579.2012.00745.x
3. McHarg J, Kay EJ, Coombes LR. Students’ engagement with their group in a problem-based learning curriculum. Eur J Dent Educ. 2011;16(1):e106-e110. DOI: 10.1111/j.1600-0579.2011.00682.x
4. Dreier-Wolfgramm A, Homeyer S, Oppermann RF, Hoffmann W. A model of interprofessional problem-based learning for medical and nursing students: Implementation, evaluation and implications for future implementation. GMS J Med Educ. 2018;35(1):Doc13. DOI: 10.3205/zma001160
5. Servant-Miklos V, Norman GR, Schmidt HG. A Short Intellectual History of Problem-Based Learning. In: Moallem M, Hung W, Dabbagh N, editors. The Wiley handbook of problem-based learning. Hoboken, NJ: John Wiley & Sons; 2019. p.3-24. DOI: 10.1002/9781119173243.ch1
6. Moallem M, Hung W, Dabbagh N. Section II, Research in PBL. In: Moallem M, Hung W, Dabbagh N, editors. The Wiley handbook of problem-based learning. Hoboken, NJ: John Wiley & Sons; 2019. p.105-106. DOI: 10.1002/9781119173243.part2
7. Dolmans D, Gijbels D. Research on problem-based learning: future challenges. Med Educ. 2013;47(2):214-218. DOI: 10.1111/medu.12105
8. Moallem M. Effects of PBL on Learning Outcomes, Knowledge Acquisition, and Higher-Order Thinking Skills. In: Moallem M, Hung W, Dabbagh N, editors. The Wiley handbook of problem-based learning. Hoboken, NJ: John Wiley & Sons; 2019. p.107-111. DOI: 10.1002/9781119173243.ch5
9. Du X, Nomikos M, Ali K, Lundberg A, Abu-Hijleh M. Health educators’ professional agency in negotiating their problem-based learning (PBL) facilitator roles: Q study. Med Educ. 2022;56(8):847-857. DOI: 10.1111/medu.14792
10. Grasl MC, Kremser K, Breckwoldt J, Gleiss A. Does the tutors’ academic background influence the learning objectives in problem-based learning? GMS J Med Educ. 2020;37(1):Doc8. DOI: 10.3205/zma001301
11. Vogt K, Pelz J, Stroux A. Refinement of a training concept for tutors in problem-based learning. GMS J Med Educ. 2017;34(4):Doc38. DOI: 10.3205/zma001115
12. Abdalla ME, Eladl MA. Student perception of the effect of problem familiarity on group discussion quality in a problem-based learning environment. GMS J Med Educ. 2019;36(3):Doc29. DOI: 10.3205/zma001237
13. Ma CW. How to advance medical education using journal articles? Insight from problem-based learning. GMS J Med Educ. 2022;39(4):Doc48. DOI: 10.3205/zma001569
14. Khan RA, Atta K, Sajjad M, Jawaid M. Twelve tips to enhance student engagement in synchronous online teaching and learning. Med Teach. 2022;44(6):601-606. DOI: 10.1080/0142159X.2021.1912310
15. Jennebach J, Ahlers O, Simonsohn A, Adler M, Özkaya J, Raupach T, et al. Digital patient-centred learning in medical education: A national learning platform with virtual patients as part of the DigiPaL project. GMS J Med Educ. 2022;39(4):Doc47. DOI: 10.1080/0142159X.2021.1912310
16. Winzer A, Jansky M. Digital lesson to convey the CanMEDS roles in general medicine using problem-based learning (PBL) and peer teaching. GMS J Med Educ. 2020;37(7):Doc64. DOI: 10.3205/zma001357
17. Ertl S, Steinmair D, Löffler-Stastka H. Encouraging communication and cooperation in e-learning: solving and creating new interdisciplinary case histories. GMS J Med Educ. 2021;38(3):Doc62. DOI: 10.3205/zma001458
18. Huber J, Witti M, Schunk M, Fischer MR, Tolks D. The use of the online Inverted Classroom Model for digital teaching with gamification in medical studies. GMS J Med Educ. 2021;38(1):Doc3. DOI: 10.3205/zma001399
19. Hege I, Kononowicz AA, Berman NB, Lenzer B, Kiesewetter J. Advancing clinical reasoning in virtual patients – development and application of a conceptual framework. GMS J Med Educ. 2018;35(1):Doc12. DOI: 10.3205/zma001159
20. Kollewe T, Ochsendorf F. Medical didactics during the pandemic: the asynchronous online seminar “Written Examinations” of the Frankfurter Arbeitsstelle für Medizindidaktik. GMS J Med Educ. 2021;38(1):Doc18. DOI: 10.3205/zma001414
21. Dadaczynski K, Tolks D. Digital health communication and health literacy in times of COVID-19. Planning and implementation of a special course of study in health promotion and prevention. GMS J Med Educ. 2021;38(1):Doc31. DOI: 10.3205/zma001427
22. Abler M, Bachmaier R, Hawelka B, Prock S, Schworm S, Merz AK, Keil S. “It just magically happened overnight!” - support for the digitalization of medical teaching provided by an interdisciplinary e-tutor team. GMS J Med Educ. 2020;37(7):Doc75. DOI: 10.3205/zma001368
23. Dolmans DH, Schmidt HG. What Do We Know About Cognitive and Motivational Effects of Small Group Tutorials in Problem-Based Learning? Adv Health Sci Educ Theory Pract. 2006;11(4):321-336. DOI: 10.1007/s10459-006-9012-8
24. Da Silva AL, Dennick R. Corpus analysis of problem-based learning transcripts: an exploratory study. Med Educ. 2010;44(3):280-288. DOI: 10.1111/j.1365-2923.2009.03575.x
25. Jaarsma AD, Dolmans DD, Muijtjens AM, Boerboom TT, van Beukelen P, Scherpbier AJ. Students’ and teachers’ perceived and actual verbal interactions in seminar groups. Med Educ. 2009;43(4):368-376. DOI: 10.1111/j.1365-2923.2009.03301.x
26. Visschers-Pleijers AJ, Dolmans DH, Leng BA, Wolfhagen IH, Vleuten CP. Analysis of verbal interactions in tutorial groups: a process study. Med Educ. 2006;40(2):129-137. DOI: 10.1111/j.1365-2929.2005.02368.x
27. Cianciolo AT, Kidd B, Murray S. Observational analysis of near-peer and faculty tutoring in problem-based learning groups. Med Educ. 2016;50(7):757-767. DOI: 10.1111/medu.12969
28. Kindler P, Grant C, Kulla S, Poole G, Godolphin W. Difficult incidents and tutor interventions in problem-based learning tutorials. Med Educ. 2009;43(9):866-873. DOI: 10.1111/j.1365-2923.2009.03423.x
29. Gilkison A. Techniques used by “expert” and “non-expert” tutors to facilitate problem-based learning tutorials in an undergraduate medical curriculum. Med Educ. 2003;37(1):6-14. DOI: 10.1046/j.1365-2923.2003.01406.x
30. Basu Roy R, McMahon GT. Video-based cases disrupt deep critical thinking in problem-based learning. Med Educ. 2012;46(4):426-435. DOI: 10.1111/j.1365-2923.2011.04197.x
31. Nieminen J, Sauri P, Lonka K. On the relationship between group functioning and study success in problem-based learning. Med Educ. 2006;40(1):64-71. DOI: 10.1111/j.1365-2929.2005.02344.x
32. Norman GR, Schmidt HG. Effectiveness of problem-based learning curricula: theory, practice and paper darts. Med Educ. 2000;34(9):721-728. DOI: 10.1046/j.1365-2923.2000.00749.x
33. Van Blankenstein FM, Dolmans DH, Van der Vleuten CP, Schmidt HG. Relevant prior knowledge moderates the effect of elaboration during small group discussion on academic achievement. Instr Sci. 2013;41(4):729-744. DOI: 10.1007/s11251-012-9252-3
34. Xhomara N. How prior knowledge, learning, teaching and assessment affect students’ achievements in Mathematics. Res Educ Learn Innov Arch. 2020;25:68-91. DOI: 10.7203/realia.25.15780
35. Davis MH, Harden RM. AMEE Medical Education Guide No. 15: Problem-based learning: a practical guide. Med Teach. 1999;21(2):130-140. DOI: 10.1080/01421599979743
36. Barrows HS, Tamblyn RM. Problem-based learning: an approach to medical education. New York: Springer Pub. Co.; 1980.
37. Gerhardt-Szep S, Kunkel F, Moeltner A, Hansen M, Böckers A, Rüttermann S, Ochsendorf F. Evaluating differently tutored groups in problem-based learning in a German dental curriculum: a mixed methods study. BMC Med Educ. 2016;16:14. DOI: 10.1186/s12909-015-0505-0
38. Dolmans DH, Ginns P. A short questionnaire to evaluate the effectiveness of tutors in PBL: validity and reliability. Med Teach. 2005;27(6):534-538. DOI: 10.1080/01421590500136477
39. Hedderich J, Sachs L. Angewandte Statistik, Methodensammlung mit R. Berlin: Springer-Verlag Berlin; 2016. DOI: 10.1007/978-3-662-45691-0
40. Oh SA, Chung EK, Woo YJ, Han ER, Kim YO. Analysis of Verbal Interactions in Problem-based Learning. Korean J Med Educ. 2010;22(2):131-139. DOI: 10.3946/kjme.2010.22.2.131
41. Koschmann T, Evensen DH, Glenn P, Hall R, Frederiksen C. Five readings of single text: transcript of video analysis session. In: Hmelo CE, Evensen DH, editors. Problem-based learning: a research perspective on learning interaction. Mahwah: Lawrence Erlbaum Associates; 2008. p.137-166.
42. Imafuku R. Japanese first-year PBL students’ learning process: a classroom discourse analysis. In: Bridges S, McGrath C, Whitehill TL, editors. Problem-based learning in clinical education. The next generation. New York: Springer; 2012. p.153-170. DOI: 10.1007/978-94-007-2515-7_10
43. Lee GH, Lin CS, Lin YH. How experienced tutors facilitate tutorial dynamics in PBL groups. Med Teach. 2012;35(2):e935-942. DOI: 10.3109/0142159X.2012.714883
44. Jin J. Sounds of silence: examining silence in problem-based learning (PBL) in Asia. In: Bridges S, McGrath C, Whitehill TL, editors. Problem-based learning in clinical education. The next generation. New York: Springer; 2012. p.171-188. DOI: 10.1007/978-94-007-2515-7_11
45. Visschers-Pleijers AJ, Dolmans DH, Wolfhagen IH, van der Vleuten CP. Exploration of a method to analyze group interactions in problem-based learning. Med Teach. 2004;26(5):471-478. DOI: 10.1080/01421590410001679064
46. Lee GH, Lin YH, Tsou KI, Shiau SJ, Lin CS. When a Problem-Based Learning Tutor Decides to Intervene. Acad Med. 2009;84(10):1406-1411. DOI: 10.1097/ACM.0b013e3181b6b433
47. Gukas ID, Leinster SJ, Walker R. Verbal and nonverbal indices of learning during problem-based learning (PBL) among first year medical students and the threshold for tutor intervention. Med Teach. 2010;32(1):e5-11. DOI: 10.3109/01421590903398232
48. Hmelo-Silver CE. Creating a learning space in problem-based learning. Interdiscip J Probl Based Learn. 2013;7:1. DOI: 10.7771/1541-5015.1334
49. Aarnio M, Lindblom-Ylänne S, Nieminen J, Pyörälä E. Dealing with conflicts on knowledge in tutorial groups. Adv Health Sci Educ Theory Pract. 2012;18(2):215-230. DOI: 10.1007/s10459-012-9366-z
50. Azer SA. Interactions Between Students and Tutor in Problem-Based Learning: The Significance of Deep Learning. Kaohsiung J Med Sci. 2009;25(5):240-249. DOI: 10.1016/S1607-551X(09)70068-3
51. Tang S, Long M, Tong F, Wang Z, Zhang H, Sutton-Jones KL. A Comparative Study of Problem-Based Learning and Traditional Approaches in College English Classrooms: Analyzing Pedagogical Behaviors Via Classroom Observation. Behav Sci (Basel). 2020;10(6):105. DOI: 10.3390/bs10060105
52. Faidley J, Evensen DH, Salisbury-Glennon J, Glenn J, Hmelo CE. How are we doing? Methods of assessing group processing in a problem-based learning context. In: Hmelo CE, Evensen DH, editors. Problem-based learning: a research perspective on learning interaction. Mahwah: Lawrence Erlbaum Associates; 2008. p.109-138.
53. Duek J. Whose group is it, anyway? Equity of student discourse in problem-based learning (PBL). In: Hmelo CE, Evensen DH, editors. Problem-based learning: a research perspective on learning interaction. Mahwah: Lawrence Erlbaum Associates; 2008. p.75-108.
54. Zohar A, Nemet F. Fostering students' knowledge and argumentation skills through dilemmas in human genetics. J Res Sci Teach. 2002;39(1):35-62. DOI: 10.1002/tea.10008
55. Schmidt HK, Rothgangel M, Grube D. Does prior domain-specific content knowledge influence students’ recall of arguments surrounding interdisciplinary topics? J Adolesc. 2017;61:96-106. DOI: 10.1016/j.adolescence.2017.10.001
56. Kunter M, Baumert J, Blum W, Klusmann U, Krauss S, Neubrand M, editors. Cognitive Activation in the Mathematics Classroom and Professional Competence of Teachers. Boston, MA: Springer US; 2013. DOI: 10.1007/978-1-4614-5149-5
57. Rieser S, Decristan J. Kognitive Aktivierung in Befragungen von Schülerinnen und Schülern. Unterscheidung zwischen dem Potential zur kognitiven Aktivierung und der individuellen kognitiven Aktivierung [Cognitive activation in student surveys. Distinguishing between the potential for cognitive activation and individual cognitive activation]. Päd Psychol. 2023;1-15. DOI: 10.1024/1010-0652/a000359
58. De Grave WS, Boshuizen HP, Schmidt HG. Problem based learning: Cognitive and metacognitive processes during problem analysis. Instr Sci. 1996;24(5):321-341.
59. Röcker N, Lottspeich C, Braun LT, Lenzer B, Frey J, Fischer MR, Schmidmaier R. Implementation of self-directed learning within clinical clerkships. GMS J Med Educ. 2021;38(2):Doc43. DOI: 10.3205/zma001439
60. Boelens R, De Wever B, Rosseel Y, Verstraete AG, Derese A. What are the most important tasks of tutors during the tutorials in hybrid problem-based learning curricula? BMC Med Educ. 2015;15:84. DOI: 10.1186/s12909-015-0368-4
61. Schmidt HG, Rotgans JI, Yew EH. The process of problem-based learning: what works and why. Med Educ. 2011;45(8):792-806. DOI: 10.1111/j.1365-2923.2011.04035.x
62. Garin O. Ceiling Effect. In: Michalos AC, editor. Encyclopedia of Quality of Life and Well-Being Research. Dordrecht: Springer; 2014. DOI: 10.1007/978-94-007-0753-5_296