gms | German Medical Science

4th Research in Medical Education (RIME) Symposium 2015

19.03-21.03.2015, München

Pedagogical methods for developing scientific reasoning and argumentation skills in higher education

Meeting Abstract

  • Diana L. Ouellette - Ludwig-Maximilians-Universität (LMU) München, München, Germany
  • Katharina Engelmann - Ludwig-Maximilians-Universität (LMU) München, München, Germany
  • Frank Fischer - Ludwig-Maximilians-Universität (LMU) München, München, Germany

4th Research in Medical Education (RIME) Symposium 2015. München, 19.-21.03.2015. Düsseldorf: German Medical Science GMS Publishing House; 2015. DocRD26

doi: 10.3205/15rime71, urn:nbn:de:0183-15rime714

Published: March 12, 2015

© 2015 Ouellette et al.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License. You are free: to Share - to copy, distribute and transmit the work, provided the original author and source are credited. See license information at http://creativecommons.org/licenses/by-nc-nd/3.0/.



Text

In the life sciences, two pedagogical methods are typically used to develop scientific reasoning and argumentation (SRA) competencies specific to research methods and experimentation: laboratory exercises (LEs) and research experiences (REs). Less frequently utilized, though more commonly advocated, are case-based reasoning exercises (CBEs) similar to those popular in medical education. The limitations of LEs and REs have been widely discussed in the science education literature, and CBEs that target SRA specific to research methods could serve as a bridge to those aspects of SRA that LEs and REs hinder.

To our knowledge, no single study has directly compared the effects of these three pedagogical methods on SRA learning or performance. This study therefore applies meta-analysis to the SRA literature from higher education settings in order to evaluate 1) the effect of SRA interventions on SRA performance, 2) potential differences in effect between LEs, REs, and CBEs on SRA performance, and 3) characteristics of the interventions and assessments that may moderate the observed effect. Preliminary results from a subset of the papers under review for inclusion in the analysis (k=20 [kT>85]) indicate a significant positive effect of SRA interventions on SRA post-test performance and support the use of a random effects model. The preliminary results also indicate significantly different moderation effects for the moderator category "pedagogical method", along with several other categories characterizing the reviewed interventions and assessments. Of note, the moderator category "instructional delivery method" exhibited significantly different and independent moderation effects (subgroups: guided instruction and digital learning environment).
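The random effects model mentioned above can be illustrated with a minimal sketch. The code below implements a standard DerSimonian-Laird random-effects pooling of per-study effect sizes; it is not the analysis code of this study, and the effect sizes and variances shown are hypothetical placeholders, not results from the reviewed papers.

```python
import math

def random_effects_pool(effects, variances):
    """Pool per-study effect sizes under a DerSimonian-Laird
    random-effects model (illustrative sketch only)."""
    k = len(effects)
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    sw = sum(w)
    fe_mean = sum(wi * y for wi, y in zip(w, effects)) / sw
    # Cochran's Q: observed heterogeneity beyond sampling error
    Q = sum(wi * (y - fe_mean) ** 2 for wi, y in zip(w, effects))
    # DerSimonian-Laird estimate of between-study variance tau^2,
    # truncated at zero
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (Q - (k - 1)) / c)
    # random-effects weights fold tau^2 into each study's variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2, Q

# hypothetical standardized effect sizes and sampling variances
effects = [0.45, 0.80, 0.30, 0.62, 0.95]
variances = [0.04, 0.06, 0.05, 0.03, 0.08]
g, se, tau2, Q = random_effects_pool(effects, variances)
print(f"pooled effect = {g:.2f} +/- {1.96 * se:.2f}, tau^2 = {tau2:.3f}")
```

A random-effects model is preferred here because the interventions differ in pedagogical method and delivery, so true effects are assumed to vary between studies rather than share a single common value.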

Additional studies are currently under review for inclusion in the meta-analysis, as a more substantive analysis is necessary to fully understand the varying effects of these pedagogical methods and perhaps to reveal how they could be used synergistically to mitigate each method's respective limitations.