
Accepted for/Published in: Journal of Medical Internet Research

Date Submitted: Jul 26, 2022
Open Peer Review Period: Dec 8, 2022 - Feb 8, 2023
Date Accepted: May 26, 2023

The final, peer-reviewed published version of this preprint can be found here:

Assessing Interventions on Crowdsourcing Platforms to Nudge Patients for Engagement Behaviors in Primary Care Settings: Randomized Controlled Trial

Chen KY, Lang Y, Zhou Y, Kosmari L, Daniel K, Gurses A, Xiao Y

J Med Internet Res 2023;25:e41431

DOI: 10.2196/41431

PMID: 37440308

PMCID: 10375278

Assessing Interventions on Crowdsourcing Platforms to Nudge Patients for Engagement Behaviors in Primary Care Settings

  • Kay-Yut Chen; 
  • Yan Lang; 
  • Yuan Zhou; 
  • Ludmila Kosmari; 
  • Kathryn Daniel; 
  • Ayse Gurses; 
  • Yan Xiao

ABSTRACT

Background:

Engaging patients in health behaviors is critical for better outcomes, yet many patient partnership behaviors are not widely adopted. Behavioral economics-based interventions offer potential solutions, but it is challenging to assess the time and cost needed for different options. Crowdsourcing platforms can efficiently and rapidly assess the efficacy of such interventions, but it is unclear if online participants respond to simulated incentives in the same way as they would to actual incentives.

Objective:

The goals of this study were (1) to assess the feasibility of using crowdsourced surveys to evaluate behavioral economics interventions for patient partnerships, by examining whether online participants responded to simulated incentives as they would to actual incentives, and (2) to assess the impact of two behavioral economics-based intervention designs, psychological rewards and loss framing, on simulated medication reconciliation behaviors in a simulated primary care setting.

Methods:

We conducted a randomized controlled trial with a between-subject design on a crowdsourcing platform (Amazon Mechanical Turk) to evaluate the effectiveness of behavioral interventions designed to improve medication adherence in primary care visits. The study included a control group, representing participants’ baseline behavior, and three simulated interventions: monetary compensation, a status effect as a psychological reward, and a loss frame as a modification of the status effect. Participants’ willingness to bring medicines to primary care visits was measured on a 5-point Likert scale. A reverse-coded question was included to check response intentionality.
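Reverse coding mirrors a Likert item so that an attentive respondent’s answer to the reversed item should be the complement of their answer to the original. A minimal sketch of the standard transformation for a 5-point scale (the function name and sample data are illustrative, not taken from the study instrument):

```python
def reverse_code(score: int, scale_max: int = 5) -> int:
    """Mirror a Likert score on a 1..scale_max scale: 1<->5, 2<->4, 3->3."""
    if not 1 <= score <= scale_max:
        raise ValueError(f"score {score} outside 1..{scale_max}")
    return (scale_max + 1) - score

# Hypothetical responses to a reverse-worded item, recoded so that
# higher values consistently mean greater willingness.
responses = [1, 2, 3, 4, 5]
recoded = [reverse_code(s) for s in responses]
print(recoded)  # [5, 4, 3, 2, 1]
```

A large gap between an item and its recoded reverse for the same respondent is a common flag for inattentive or unintentional answering, which is the consistency check the Methods describe.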

Results:

A total of 569 participants were recruited: 132 in the baseline group, 187 in the monetary compensation group, 149 in the psychological reward group, and 101 in the loss frame group. All three nudge interventions significantly increased participants’ willingness to bring medicines compared with the baseline scenario: monetary compensation by 17.51% (P<.001), psychological rewards on status by 11.85% (P<.001), and a loss frame on the psychological rewards by 24.35% (P<.001). Responses to the reverse-coded question were consistent with the willingness questions.

Conclusions:

In primary care, bringing medications to office visits is a frequently advocated patient partnership behavior that is nonetheless not widely adopted. Crowdsourcing platforms such as MTurk support efforts to efficiently and rapidly reach large groups of individuals to assess the efficacy of behavioral interventions. We found that crowdsourced survey-based experiments with simulated incentives can produce valid simulated behavioral responses. The use of psychological status design, particularly with a loss framing approach, can effectively enhance patient engagement in primary care. These results support the use of crowdsourcing platforms to augment and complement traditional approaches to learning about behavioral economics for patient engagement.


Citation

Please cite as:

Chen KY, Lang Y, Zhou Y, Kosmari L, Daniel K, Gurses A, Xiao Y

Assessing Interventions on Crowdsourcing Platforms to Nudge Patients for Engagement Behaviors in Primary Care Settings: Randomized Controlled Trial

J Med Internet Res 2023;25:e41431

DOI: 10.2196/41431

PMID: 37440308

PMCID: 10375278


© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft other than for review purposes.