
Accepted for/Published in: JMIR Human Factors

Date Submitted: Jul 2, 2021
Date Accepted: Apr 19, 2022

The final, peer-reviewed published version of this preprint can be found here:

Assessing the Usability of a Clinical Decision Support System: Heuristic Evaluation

Cho H, Keenan G, Madandola OO, Dos Santos FC, Macieira TGR, Bjarnadottir RI, Priola KJ, Lopez KD


JMIR Hum Factors 2022;9(2):e31758

DOI: 10.2196/31758

PMID: 35536613

PMCID: 9090311

Assessing Usability of Clinical Decision Support System: Heuristic Evaluation

  • Hwayoung Cho; 
  • Gail Keenan; 
  • Olatunde O. Madandola; 
  • Fabiana Cristina Dos Santos; 
  • Tamara G. R. Macieira; 
  • Ragnhildur I. Bjarnadottir; 
  • Karen J.B. Priola; 
  • Karen Dunn Lopez

ABSTRACT

Background:

One of the primary causes of unintended consequences related to the use of electronic health record (EHR) systems that negatively affect patient safety is poor usability. Because of the cost and time needed to carry out iterative evaluations, many EHR components, such as clinical decision support systems (CDSs), have not had rigorous usability testing before deployment in practice. Usability testing in the predeployment phase is crucial for eliminating usability issues and preventing costly fixes that would otherwise be needed after implementation.

Objective:

To present an example application of a systematic evaluation method that uses clinician experts with human-computer interaction (HCI) expertise to evaluate the usability of an electronic CDS intervention before its deployment in a randomized controlled trial.

Methods:

Six HCI experts were invited to participate in a heuristic evaluation of the CDS intervention. Each expert was asked to independently explore the intervention at least twice. After completing the tasks using patient scenarios, each expert completed a heuristic evaluation checklist developed by Bright et al based on Nielsen's 10 heuristics. Each expert also rated the overall severity of each identified usability problem on a 5-point scale ranging from 0 (no problem) to 4 (usability catastrophe). Data from the coded comments were synthesized, and the severity of each identified heuristic violation was analyzed.
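The severity analysis described above amounts to averaging each heuristic's ratings across experts. A minimal sketch of that aggregation is shown below; the ratings are hypothetical placeholders, not the study's data:

```python
# Minimal sketch (hypothetical data): averaging the severity ratings
# (0=no problem ... 4=usability catastrophe) that each HCI expert
# assigns to a Nielsen heuristic, as described in the Methods.
from statistics import mean

# Hypothetical ratings from six experts for two of Nielsen's heuristics.
ratings = {
    "Flexibility and Efficiency of Use": [0, 1, 1, 0, 1, 1],
    "User Control and Freedom": [2, 2, 3, 1, 2, 2],
}

# Mean severity per heuristic; scores closer to 0 indicate a more usable system.
mean_severity = {h: round(mean(s), 2) for h, s in ratings.items()}

for heuristic, score in sorted(mean_severity.items(), key=lambda kv: kv[1]):
    print(f"{heuristic}: {score}")
```

Ranking heuristics by mean severity in this way highlights which principles most need refinement before deployment.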

Results:

The 6 HCI experts included 4 with expertise in nursing, 1 in pharmacy, and 1 in systems engineering. Mean severity scores for the identified heuristic violations ranged from 0.66 (Flexibility and Efficiency of Use) to 2.00 (User Control and Freedom, and Error Prevention), where scores closer to 0 indicate a more usable system. User Control and Freedom, the heuristic principle identified as most in need of refinement, was considered a major usability problem, particularly by the non-nursing HCI experts. In response to the heuristic Match Between System and the Real World, experts pointed to the reversed direction of the pain scale used in our system (1=severe) compared with the scale commonly used in clinical practice (1=mild); this was identified as a minor usability problem, but its refinement was repeatedly emphasized by the nursing HCI experts.

Conclusions:

Our heuristic evaluation process is simple and systematic and can be used at multiple stages of system development to reduce the time and cost needed to establish the usability of a system before widespread implementation. Furthermore, heuristic evaluation can help organizations develop transparent reporting on usability, as required by Title IV of the 21st Century Cures Act. Testing of EHRs and CDSs by clinician experts with HCI expertise during heuristic evaluation can reduce the frequency and increase the quality of testing, which will reduce clinicians' cognitive workload and errors and enhance the likelihood of EHR/CDS adoption.




© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC-BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.