Accepted for/Published in: JMIR Medical Education

Date Submitted: Jan 29, 2019
Open Peer Review Period: Feb 1, 2019 - Mar 29, 2019
Date Accepted: Jul 22, 2019

The final, peer-reviewed published version of this preprint can be found here:

Association of Online Learning Behavior and Learning Outcomes for Medical Students: Large-Scale Usage Data Analysis

Bientzle M, Hircin E, Kimmerle J, Knipfer C, Smeets R, Gaudin R, Holtz P

JMIR Med Educ 2019;5(2):e13529

DOI: 10.2196/13529

PMID: 31436166

PMCID: 6724501

Warning: This is an author submission that is not peer-reviewed or edited. Preprints - unless they show as "accepted" - should not be relied on to guide clinical practice or health-related behavior and should not be reported in news media as established information.

Association of Online Learning Behavior and Learning Outcomes for Medical Students: Large-Scale Usage Data Analysis

  • Martina Bientzle; 
  • Emrah Hircin; 
  • Joachim Kimmerle; 
  • Christian Knipfer; 
  • Ralf Smeets; 
  • Robert Gaudin; 
  • Peter Holtz

Background:

Digital learning environments have become very common in the training of medical professionals, and students often use such platforms for exam preparation. Multiple-choice questions (MCQs) are a common format in medical exams, and students frequently use them to prepare for these exams.

Objective:

We aimed to examine whether particular learning activities contributed more strongly than others to users’ exam performance.

Methods:

We analyzed usage data from Amboss, an online platform that provides learning materials to medical students preparing for their final exams. The analysis was retrospective, drawing on archival usage data from April 2014 to April 2017. We included the 23,633 users who had studied at least 200 learning cards and answered at least 1000 test questions in the 180 days before their state exam; for each user, the platform supplied the number of learning cards studied, the number of test questions answered, and the number of individual notes taken. We examined whether the number of learning cards viewed and the number of MCQs answered were positively related to learning outcomes, and which of the two activities was more effective. We also tested whether taking individual notes predicted learning outcomes, and whether notes had an effect after controlling for the effects of learning cards and MCQs. Learning outcomes were measured as final state exam scores, calculated from the answers that participants entered voluntarily.
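The cohort-selection step described above (at least 200 learning cards and 1000 test questions in the 180 days before the exam) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the column names (`user_id`, `days_before_exam`, `cards_viewed`, `questions_answered`) and the shape of the activity table are assumptions, since the Amboss data schema is not public.

```python
import pandas as pd

def select_cohort(df: pd.DataFrame) -> pd.DataFrame:
    """Keep users with >=200 learning cards studied and >=1000 test
    questions answered within the 180 days before their state exam.
    Expects one row per user/activity record (hypothetical schema)."""
    # Restrict to activity inside the 180-day pre-exam window
    in_window = df[df["days_before_exam"] <= 180]
    # Total activity per user within the window
    totals = in_window.groupby("user_id").agg(
        cards=("cards_viewed", "sum"),
        questions=("questions_answered", "sum"),
    )
    # Apply the inclusion thresholds from the Methods section
    return totals[(totals["cards"] >= 200) & (totals["questions"] >= 1000)]

# Tiny illustrative usage with made-up activity records
activity = pd.DataFrame({
    "user_id":            [1, 1, 2, 2, 3],
    "days_before_exam":   [10, 200, 30, 90, 5],
    "cards_viewed":       [150, 400, 250, 100, 50],
    "questions_answered": [600, 900, 700, 500, 2000],
})
cohort = select_cohort(activity)  # only user 2 meets both thresholds
```

Note that user 1's large activity record falls outside the 180-day window and so does not count toward the thresholds.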

Results:

Both the number of cards studied (r=.22; P<.001) and the number of test questions answered (r=.23; P<.001) correlated positively with the percentage of correct answers in the learners' medical exams. In a hierarchical regression analysis, the number of test questions answered remained a significant predictor even after controlling for the number of learning cards studied (β=.14; P<.001; ΔR2=.017; P<.001). We also found a negative interaction between the number of learning cards and the number of MCQs; nevertheless, users who scored high on both measures had the highest exam scores. The 8040 participants who had taken at least one note achieved a higher percentage of correct answers (80.94%; SD=7.44) than those who had not taken any notes (78.73%; SD=7.80; t23631=20.95; P<.001). In a stepwise regression, the number of notes taken predicted the percentage of correct answers over and above the effects of the number of learning cards studied and the number of test questions answered in step one (β=.06; P<.001; ΔR2=.004; P<.001).
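The hierarchical-regression logic reported above can be sketched on synthetic data (the study's user-level data are not public, so the numbers below are illustrative only). Step 1 regresses exam score on cards studied; step 2 adds questions answered; the increment in R² (ΔR²) quantifies what the second predictor explains beyond the first.

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary-least-squares fit of y on X (intercept added)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Synthetic, standardized stand-ins for the study variables
rng = np.random.default_rng(0)
n = 5000
cards = rng.normal(size=n)
questions = 0.4 * cards + rng.normal(size=n)   # predictors are correlated
score = 0.2 * cards + 0.15 * questions + rng.normal(size=n)

# Step 1: cards only; Step 2: cards + questions
r2_step1 = r_squared(cards[:, None], score)
r2_step2 = r_squared(np.column_stack([cards, questions]), score)
delta_r2 = r2_step2 - r2_step1  # incremental variance explained by questions
```

A small but nonzero ΔR², as in the study (ΔR2=.017), indicates that answering questions carries predictive information beyond card study alone, even when the two activities overlap.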

Conclusions:

These results show that online learning platforms are particularly helpful when learners engage in active elaboration of the learning material, for example by answering MCQs or taking notes.


 Citation

Please cite as:

Bientzle M, Hircin E, Kimmerle J, Knipfer C, Smeets R, Gaudin R, Holtz P

Association of Online Learning Behavior and Learning Outcomes for Medical Students: Large-Scale Usage Data Analysis

JMIR Med Educ 2019;5(2):e13529

DOI: 10.2196/13529

PMID: 31436166

PMCID: 6724501

Per the author's request the PDF is not available.

© The authors. All rights reserved. This is a privileged document currently under peer-review/community review (or an accepted/rejected manuscript). Authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC-BY license on publication, at this stage authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.