
Accepted for/Published in: Journal of Medical Internet Research

Date Submitted: Jul 26, 2022
Open Peer Review Period: Dec 8, 2022 - Feb 8, 2023
Date Accepted: May 26, 2023

The final, peer-reviewed published version of this preprint can be found here:

Assessing Interventions on Crowdsourcing Platforms to Nudge Patients for Engagement Behaviors in Primary Care Settings: Randomized Controlled Trial

Chen KY, Lang Y, Zhou Y, Kosmari L, Lamichhane S, Daniel K, Gurses A, Xiao Y

J Med Internet Res 2023;25:e41431

DOI: 10.2196/41431

PMID: 37440308

PMCID: 10375278

Warning: This is an author submission that is not peer-reviewed or edited. Preprints - unless they show as "accepted" - should not be relied on to guide clinical practice or health-related behavior and should not be reported in news media as established information.

Assessing Interventions on Crowdsourcing Platforms for Improving Patient Behaviors in Primary Care Settings

  • Kay-Yut Chen; 
  • Yan Lang; 
  • Yuan Zhou; 
  • Ludmila Kosmari; 
  • Sanjog Lamichhane; 
  • Kathryn Daniel; 
  • Ayse Gurses; 
  • Yan Xiao

ABSTRACT

Background:

The principles of behavioral economics (BE) suggest many potential ways to develop meaningful health care partnerships with patients. Crowdsourced experimental surveys may offer an efficient way, in both time and cost, to assess these different options.

Objective:

The goals of this study were (1) to assess the feasibility of using crowdsourced surveys to evaluate BE interventions for patient partnerships, and (2) to assess the impact of two BE-based intervention designs, psychological rewards and loss framing, on simulated medication reconciliation behaviors in a simulated primary care setting.

Methods:

We conducted between-subject survey experiments on a crowdsourcing platform (Amazon Mechanical Turk) to assess the design of behavioral interventions. Interventions were aimed at improving a targeted behavior that pertains to bringing medicines to primary care office visits. The baseline and three simulated interventions were compared in simulated primary care office visit scenarios. Interventions were monetary compensation, status effect as a psychological reward, and loss frame as a modification of the status effect. Willingness to bring medicines was measured on a 5-point Likert scale. A reverse coding question was included to assess response intentionality.
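The reverse coding question mentioned above is a standard survey-quality check: the item is worded in the opposite direction, so attentive respondents' answers should mirror their other responses once the scale is flipped. As a minimal sketch (the function name and scale bounds are illustrative, not taken from the study instrument), reverse-coding a 5-point Likert response looks like this:

```python
def reverse_code(score, scale_min=1, scale_max=5):
    """Flip a Likert response: on a 5-point scale, 1<->5, 2<->4, 3 unchanged."""
    return scale_max + scale_min - score

# Flipping every point on a 5-point scale
print([reverse_code(s) for s in [1, 2, 3, 4, 5]])  # [5, 4, 3, 2, 1]
```

After reverse-coding, responses to the check item can be compared directly with the willingness items to flag inattentive or inconsistent respondents.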

Results:

A total of 569 study participants were recruited: 132 in the baseline group, 187 in the monetary compensation group, 149 in the psychological reward group, and 101 in the loss framing group. All three interventions significantly increased participants' willingness to bring medicines compared with the baseline scenario. Monetary compensation increased willingness by 13.06% (P<.001), psychological rewards by 6.53% (P=.025), and loss framing of the psychological reward by 16.80% (P<.001). Responses to the reverse coding question were consistent with the willingness questions.
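The percentage increases reported above compare each intervention group's mean willingness against the baseline group's mean. A minimal sketch of that comparison, using synthetic Likert responses (the values below are illustrative only, not the study's data):

```python
from statistics import mean

def percent_increase(baseline_scores, treatment_scores):
    """Percent change in mean Likert willingness relative to baseline."""
    b = mean(baseline_scores)
    t = mean(treatment_scores)
    return (t - b) / b * 100

# Synthetic 5-point Likert responses (illustrative, not study data)
baseline = [3, 4, 3, 2, 4, 3]
loss_frame = [4, 5, 4, 3, 5, 4]

print(round(percent_increase(baseline, loss_frame), 2))
```

In the study itself, each comparison was also accompanied by a significance test (the reported P values); this sketch shows only the effect-size calculation.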

Conclusions:

In primary care, bringing medications to office visits is a frequently advocated patient partnership behavior that is nonetheless not widely adopted. Crowdsourcing platforms such as MTurk support efforts to efficiently and rapidly reach large groups of individuals to assess the efficacy of behavioral interventions. We found that crowdsourced survey-based experiments with simulated monetary compensation resulted in valid simulated behavioral responses. Simulated psychological status design, especially with a loss framing design, had a significant impact on the targeted behavior. It should thus be considered an effective behavioral intervention design to enhance patient engagement in primary care. These results support the use of crowdsourcing platforms to augment and complement traditional approaches to learning about behavioral economics for patient engagement.



© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft other than for review purposes.