Accepted for/Published in: Journal of Medical Internet Research

Date Submitted: Nov 15, 2022
Open Peer Review Period: Nov 15, 2022 - Jan 10, 2023
Date Accepted: Mar 10, 2023

The final, peer-reviewed published version of this preprint can be found here:

Consensus on the Terms and Procedures for Planning and Reporting a Usability Evaluation of Health-Related Digital Solutions: Delphi Study and a Resulting Checklist


Warning: This is an author submission that is not peer-reviewed or edited. Preprints - unless they show as "accepted" - should not be relied on to guide clinical practice or health-related behavior and should not be reported in news media as established information.

Consensus on the terms and procedures for usability evaluation – a Delphi study

  • Ana Isabel Martins; 
  • Gonçalo Santinha; 
  • Ana Margarida Almeida; 
  • Óscar Ribeiro; 
  • Telmo Silva; 
  • Nelson Rocha; 
  • Anabela G. Silva

ABSTRACT

Background:

Studies on usability evaluation lack common terminology and reporting standards.

Objective:

To generate consensus on terms and procedures that should be considered when planning and reporting a study on usability evaluation of health-related digital solutions both by users and experts.

Methods:

A two-round Delphi study was conducted with a panel of 30 international participants experienced in usability evaluation. In the first round, participants were asked to comment on definitions, rate the importance of pre-identified procedures on a 9-point Likert scale, and suggest additional procedures. In the second round, the same participants reappraised the relevance of each procedure, informed by the results of the first round.

Results:

Agreement was achieved on the definitions of a set of terms related to usability evaluation, and a total of 38 procedures related to planning and reporting usability evaluations were identified. Consensus on relevance was achieved for 30 (79%) of the 38 procedures.

Conclusions:

This work is an important step toward a more standardized approach to usability evaluation and may contribute to enhancing the quality of planning and reporting in usability studies.


Citation

Please cite as:

Martins AI, Santinha G, Almeida AM, Ribeiro Ó, Silva T, Rocha N, Silva AG

Consensus on the Terms and Procedures for Planning and Reporting a Usability Evaluation of Health-Related Digital Solutions: Delphi Study and a Resulting Checklist

J Med Internet Res 2023;25:e44326

DOI: 10.2196/44326

PMID: 37279047

PMCID: 10282913


© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft other than for review purposes.