Accepted for/Published in: Journal of Medical Internet Research
Date Submitted: Nov 15, 2022
Open Peer Review Period: Nov 15, 2022 - Jan 10, 2023
Date Accepted: Mar 10, 2023
Consensus on the terms and procedures for planning and reporting usability evaluation of health-related digital solutions: a Delphi study and a resulting checklist
ABSTRACT
Background:
Usability evaluation, both by experts and by target users, is an integral part of developing and assessing digital solutions. It increases the likelihood that digital solutions will be easier, safer, more efficient, and more pleasant to use. However, despite the widespread recognition of the importance of usability evaluation, there is a lack of research and consensus on related concepts and reporting standards.
Objective:
To generate consensus on terms and procedures that should be considered when planning and reporting a study on usability evaluation of health-related digital solutions both by users and experts.
Methods:
A Delphi study with two rounds was conducted with a panel of international participants experienced in usability evaluation. In the first round, they were asked to comment on definitions, rate the importance of pre-identified methodological procedures on a 9-point Likert scale, and suggest additional procedures. In the second round, experienced participants were asked to reappraise the relevance of each procedure in light of the first-round results. Consensus on the relevance of an item was defined a priori as at least 70% of experienced participants scoring that item 7 to 9 and less than 15% scoring it 1 to 3.
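To make the a priori consensus rule concrete, a minimal sketch in Python is given below (illustrative only, not part of the study; the function name and example ratings are assumptions):

    # Illustrative sketch of the a priori consensus rule, assuming one
    # 1-9 rating per experienced participant for a given item.
    def reaches_consensus(ratings):
        """True if >=70% of ratings fall in 7-9 and <15% fall in 1-3."""
        n = len(ratings)
        high = sum(1 for r in ratings if 7 <= r <= 9)
        low = sum(1 for r in ratings if 1 <= r <= 3)
        return high / n >= 0.70 and low / n < 0.15

    # Hypothetical example: 30 panellists; 23 rate the item 7-9, 3 rate it 1-3.
    # 23/30 = 76.7% >= 70% and 3/30 = 10% < 15%, so consensus is reached.
    print(reaches_consensus([8] * 23 + [5] * 4 + [2] * 3))  # True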
Results:
A total of 30 participants from 11 countries entered the Delphi study (mean age 37.2, SD 7.7 years; 20 were female). Agreement was achieved on the definitions for all proposed usability evaluation-related terms (usability assessment moderator, participant, usability evaluation method, usability evaluation technique, tasks, usability evaluation environment, usability evaluator, and domain evaluator). A total of 38 procedures related to planning and reporting usability evaluation were identified across rounds: 28 related to usability evaluation involving users and 10 related to usability evaluation involving experts. Consensus on relevance was achieved for 23 (82%) of the procedures involving users and for 7 (70%) of the procedures involving experts.
Conclusions:
This study proposes a set of terms with respective definitions, together with a checklist to guide the planning and reporting of usability evaluation studies. This constitutes an important step towards a more standardized approach in the field of usability evaluation and may enhance the quality of planning and reporting of usability studies. Future studies can further validate this work by refining the definitions, assessing the practical applicability of the checklist for specific digital solutions, or assessing whether using the checklist results in higher-quality digital solutions.
Copyright
© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC-BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.