Currently accepted at: Journal of Medical Internet Research

Date Submitted: Apr 27, 2018
Open Peer Review Period: Apr 29, 2018 - Jun 24, 2018
Date Accepted: Oct 26, 2018

This paper has been accepted and is currently in production.

It will appear shortly at DOI 10.2196/10793.

The final, peer-reviewed published version of this preprint can be found here:

Lalor JP, Woolf B, Yu H. Improving Electronic Health Record Note Comprehension With NoteAid: Randomized Trial of Electronic Health Record Note Comprehension Interventions With Crowdsourced Workers. J Med Internet Res 2019;21(1):e10793

DOI: 10.2196/10793

PMID: 30664453

PMCID: 6351990

Improving Electronic Health Record Note Comprehension With NoteAid: Randomized Trial of Electronic Health Record Note Comprehension Interventions With Crowdsourced Workers

  • John P Lalor; 
  • Beverly Woolf; 
  • Hong Yu

ABSTRACT

Background:

Patient portals are becoming more common, and with them comes the ability of patients to access their personal electronic health records (EHRs). EHRs, in particular free-text EHR notes, often contain medical jargon and terms that are difficult for laypersons to understand. Many Web-based resources exist for learning more about particular diseases or conditions, including systems that directly link medical concepts to lay definitions or educational materials.

Objective:

Our goal is to determine whether the use of one such tool, NoteAid, leads to better EHR note comprehension.

Methods:

In this work, we compare a passive, self-service educational resource (MedlinePlus) with an active resource (NoteAid) that provides users with definitions for the medical concepts it identifies. We use Amazon Mechanical Turk (AMT) to recruit individuals to complete ComprehENotes, a new test of EHR note comprehension.

Results:

Mean scores for individuals with access to NoteAid are significantly higher than the mean baseline scores, both for raw scores (P=0.01) and estimated ability (P=0.02).

Conclusions:

In our experiments, we show that the active intervention leads to significantly higher scores on the comprehension test compared with a baseline group given no resources. In contrast, there is no significant difference between the group provided with the passive intervention and the baseline group. Finally, we analyze the demographics of the individuals who participated in our AMT task and find differences between groups that align with the current understanding of health literacy across populations.


 Citation

Please cite as:

Lalor JP, Woolf B, Yu H

Improving Electronic Health Record Note Comprehension With NoteAid: Randomized Trial of Electronic Health Record Note Comprehension Interventions With Crowdsourced Workers

J Med Internet Res 2019;21(1):e10793

DOI: 10.2196/10793

URL: https://preprints.jmir.org/preprint/10793

PMID: 30664453

PMCID: 6351990


© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have granted JMIR Publications an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be released under a CC BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft other than for review purposes.