Accepted for/Published in: Interactive Journal of Medical Research
Date Submitted: Nov 19, 2018
Date Accepted: Jun 12, 2019
Warning: This is an author submission that is not peer-reviewed or edited. Preprints - unless they show as "accepted" - should not be relied on to guide clinical practice or health-related behavior and should not be reported in news media as established information.
Readability and Quality of Online Information on Osteoarthritis: An Objective Analysis With Historic Comparison
Background:
Osteoarthritis (OA) is the most common cause of disability in people older than 65 years. Readability of online OA information has never been assessed. A 2003 study found the quality of online OA information to be poor.
Objective:
The aim of this study was to review the readability and quality of current online information regarding OA.
Methods:
The term "osteoarthritis" was searched on the three most popular English-language search engines. The first 25 pages from each search engine were analyzed. Duplicate pages, websites featuring paid advertisements, inaccessible pages (behind a paywall or unavailable for geographical reasons), and nontext pages were excluded. Readability was measured using the Flesch Reading Ease Score, the Flesch-Kincaid Grade Level, and the Gunning-Fog Index. Website quality was scored using the Journal of the American Medical Association (JAMA) benchmark criteria and the DISCERN criteria. Presence or absence of Health On the Net Foundation Code of Conduct (HONcode) certification, age of content, content producer, and author characteristics were noted.
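The three readability indices named above have standard published definitions based on simple text counts. As a minimal sketch (the counts are illustrative inputs, not data from this study; real tools differ mainly in how they count syllables):

```python
# Standard published formulas for the three readability indices.
# Inputs are raw counts: total words, sentences, syllables, and
# "complex" words (three or more syllables, for the Fog index).

def flesch_reading_ease(words, sentences, syllables):
    """Higher score = easier text; 60-70 is roughly plain English."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words, sentences, syllables):
    """US school grade level required to understand the text."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

def gunning_fog(words, sentences, complex_words):
    """Grade level; penalizes long sentences and polysyllabic words."""
    return 0.4 * ((words / sentences) + 100 * (complex_words / words))

# Illustrative counts for a hypothetical passage:
if __name__ == "__main__":
    w, s, syl, cx = 100, 5, 150, 10
    print(round(flesch_reading_ease(w, s, syl), 1))   # 59.6
    print(round(flesch_kincaid_grade(w, s, syl), 1))  # 9.9
    print(round(gunning_fog(w, s, cx), 1))            # 12.0
```

Because the grade-level formulas weight sentence length and syllable counts differently, the same text can score at different grade levels depending on the index used, which is why the study reports a range rather than a single level.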
Results:
A total of 37 unique websites were found suitable for analysis. Readability varied by assessment tool from an 8th to a 12th grade reading level, against the recommended 7th to 8th grade level. Only 1 of the 37 websites (2.7%) met all 4 JAMA benchmark criteria. The mean DISCERN quality rating for OA websites was "fair," compared with the "poor" rating reported in the 2003 study. HONcode-certified websites (16/37, 43%) were of statistically significantly higher quality.
Conclusions:
The readability of online health information on OA was either at or above the recommended reading level.
Copyright
© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC-BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.