Accepted for/Published in: JMIR mHealth and uHealth

Date Submitted: Dec 13, 2020
Date Accepted: Mar 20, 2021

The final, peer-reviewed published version of this preprint can be found here:

Assessing the Quality of Mobile Health-Related Apps: Interrater Reliability Study of Two Guides

Miró J, Llorens-Vernet P

JMIR Mhealth Uhealth 2021;9(4):e26471

DOI: 10.2196/26471

PMID: 33871376

PMCID: 8094021

On the Assessment of the Quality of mHealth-related Apps: an Interrater Reliability Study of Two Guides

  • Jordi Miró
  • Pere Llorens-Vernet

ABSTRACT

Background:

A huge number of health-related apps are available, and the number is growing fast. However, many of them have been developed without any kind of quality control. In an attempt to contribute to the development of high-quality apps and to enable existing ones to be assessed, several guides have been developed.

Objective:

The main aim of this study was to examine the interrater reliability of a new guide, the Mobile App Development and Assessment Guide (MAG), and to compare it with one of the most widely used guides in the field, the Mobile App Rating Scale (MARS).

Methods:

To study the interrater reliability of the MAG and the MARS, we evaluated the four most downloaded chronic health apps for Android and iOS devices. A group of eight reviewers, comprising different types of stakeholders (clinical researchers, engineers, health care professionals, and end users as potential patients), independently evaluated the quality of the apps using the MAG and the MARS. We used Krippendorff's alpha to calculate the interrater reliability.
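
Krippendorff's alpha is defined as alpha = 1 − D_o / D_e, where D_o is the observed disagreement among reviewers and D_e is the disagreement expected by chance, so alpha = 1 indicates perfect agreement and alpha = 0 indicates agreement no better than chance. As a rough illustration of how a per-category coefficient of this kind can be computed from a reviewers-by-apps rating matrix, the minimal Python sketch below uses the third-party "krippendorff" package and made-up ordinal ratings; neither the data nor the code comes from the study itself.

# Minimal sketch: interrater reliability for one hypothetical guide category.
# The ratings are invented and the third-party `krippendorff` package
# (pip install krippendorff) is an assumed tool, not the study's own code.
import numpy as np
import krippendorff

# Rows = 8 reviewers, columns = 4 apps; ordinal scores, np.nan marks a missing rating.
security_ratings = np.array([
    [4, 3, 5, 4],
    [4, 3, 5, 4],
    [5, 3, 4, 4],
    [4, 2, 5, 4],
    [4, 3, 5, 3],
    [4, 3, 4, 4],
    [5, 3, 5, 4],
    [4, np.nan, 5, 4],
])

alpha = krippendorff.alpha(
    reliability_data=security_ratings,
    level_of_measurement="ordinal",
)
print(f"Krippendorff's alpha for this category: {alpha:.2f}")

For interpretation, Krippendorff's customary guideline treats alpha of at least 0.800 as acceptable and alpha of at least 0.667 as the lowest limit for drawing tentative conclusions.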

Results:

Only a few categories of the MAG and the MARS demonstrated high interrater reliability. Although the MAG performed better, there was considerable variation in the scores among the different types of reviewers. The categories with the highest interrater reliability in the MAG were “Security” (alpha = 0.78) and “Privacy” (alpha = 0.73).

Conclusions:

This study shows that some categories of the MAG have significant interrater reliability and that the MAG scores are better than the MARS scores. However, there is great variability in the responses, which seems to be associated with subjective interpretation by the reviewers.

