
Accepted for/Published in: Journal of Medical Internet Research

Date Submitted: Nov 15, 2022
Open Peer Review Period: Nov 15, 2022 - Jan 10, 2023
Date Accepted: Mar 10, 2023

The final, peer-reviewed published version of this preprint can be found here:

Martins AI, Santinha G, Almeida AM, Ribeiro Ó, Silva T, Rocha N, Silva AG. Consensus on the Terms and Procedures for Planning and Reporting a Usability Evaluation of Health-Related Digital Solutions: Delphi Study and a Resulting Checklist. J Med Internet Res 2023;25:e44326

DOI: 10.2196/44326

PMID: 37279047

PMCID: 10282913

Consensus on the terms and procedures for planning and reporting usability evaluation of health-related digital solutions: a Delphi study and a resulting checklist

  • Ana Isabel Martins; 
  • Gonçalo Santinha; 
  • Ana Margarida Almeida; 
  • Óscar Ribeiro; 
  • Telmo Silva; 
  • Nelson Rocha; 
  • Anabela G. Silva

ABSTRACT

Background:

Usability evaluation, both by experts and by target users, is an integral part of developing and assessing digital solutions. It increases the probability of producing digital solutions that are easier, safer, more efficient, and more pleasant to use. However, despite the widespread recognition of its importance, there is a lack of research and consensus on related concepts and reporting standards.

Objective:

To generate consensus on the terms and procedures that should be considered when planning and reporting a study on the usability evaluation of health-related digital solutions by both users and experts.

Methods:

A two-round Delphi study was conducted with a panel of international participants experienced in usability evaluation. In the first round, participants were asked to comment on definitions, rate the importance of pre-identified methodological procedures on a 9-point Likert scale, and suggest additional procedures. In the second round, they reappraised the relevance of each procedure, informed by the round 1 results. Consensus on the relevance of an item was defined a priori as at least 70% of experienced participants scoring the item 7 to 9 and fewer than 15% scoring it 1 to 3.
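The a priori consensus rule can be sketched as a small, hypothetical Python check (the function name and example ratings are illustrative, not from the study):

```python
# Illustrative implementation of the consensus rule described above:
# an item reaches consensus when at least 70% of participants rate it
# 7-9 AND fewer than 15% rate it 1-3, on a 9-point scale.

def reaches_consensus(ratings):
    """ratings: one integer score (1-9) per participant."""
    n = len(ratings)
    high = sum(1 for r in ratings if 7 <= r <= 9)  # scores 7-9
    low = sum(1 for r in ratings if 1 <= r <= 3)   # scores 1-3
    return high / n >= 0.70 and low / n < 0.15

# 8 of 10 participants (80%) rate 7-9; one participant (10%) rates 1-3.
print(reaches_consensus([9, 8, 7, 7, 8, 9, 7, 8, 5, 2]))  # True
```

Note that both conditions must hold: an item with 70% high scores still fails if 15% or more of participants actively rate it 1 to 3.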

Results:

A total of 30 participants from 11 countries entered the Delphi study (mean age 37.2 years, SD 7.7; 20 women). Agreement was achieved on the definitions of all proposed usability evaluation-related terms (usability assessment moderator, participant, usability evaluation method, usability evaluation technique, tasks, usability evaluation environment, usability evaluator, and domain evaluator). A total of 38 procedures related to planning and reporting usability evaluation were identified across rounds: 28 related to usability evaluation involving users and 10 related to usability evaluation involving experts. Consensus on relevance was achieved for 23 (82%) of the procedures involving users and for 7 (70%) of those involving experts.

Conclusions:

This study proposes a set of terms and their definitions, together with a checklist, to guide the planning and reporting of usability evaluation studies. This constitutes an important step toward a more standardized approach that may enhance the quality of planning and reporting in the field. Future studies can further validate this work by refining the definitions, assessing the practical applicability of the checklist for specific digital solutions, or assessing whether using the checklist results in higher-quality digital solutions.




© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft other than for review purposes.