Accepted for/Published in: JMIR Medical Education

Date Submitted: Jun 27, 2021
Date Accepted: Mar 31, 2022

The final, peer-reviewed published version of this preprint can be found here:

Video-Based Communication Assessment of Physician Error Disclosure Skills by Crowdsourced Laypeople and Patient Advocates Who Experienced Medical Harm: Reliability Assessment With Generalizability Theory

White AA, King AM, D’Addario AE, Brigham KB, Dintzis S, Fay EE, Gallagher TH, Mazor KM

Video-Based Communication Assessment of Physician Error Disclosure Skills by Crowdsourced Laypeople and Patient Advocates Who Experienced Medical Harm: Reliability Assessment With Generalizability Theory

JMIR Med Educ 2022;8(2):e30988

DOI: 10.2196/30988

PMID: 35486423

PMCID: 9107044

Comparison of Crowdsourced Laypeople and Patient Advocates Who Experienced Medical Harm With a Video-Based Communication Assessment of Physician Error Disclosure Skills: Reliability Assessment With Generalizability Theory

  • Andrew A White; 
  • Ann M King; 
  • Angelo E D’Addario; 
  • Karen Berg Brigham; 
  • Suzanne Dintzis; 
  • Emily E Fay; 
  • Thomas H Gallagher; 
  • Kathleen M Mazor

ABSTRACT

Background:

Physicians need simulated practice with personalized feedback to prepare for disclosure conversations with patients after harmful errors. Ideally, feedback would come from patients who have experienced communication after medical harm, but this approach is impractical at scale. Until recently, reliable, standardized, and affordable assessment tools were lacking. The Video-based Communication Assessment (VCA) is a novel tool designed to engage crowdsourced laypeople to rate physician communication skills, but has not been validated for medical harm scenarios.

Objective:

To compare the reliability of ratings of physician error disclosure communication skills by two assessment groups using the VCA.

Methods:

Internal medicine residents completed a VCA case depicting a delayed diagnosis of breast cancer, consisting of three sequential vignettes to which residents recorded audio responses. Panels of patient advocates who had experienced harmful medical error, either themselves or through a family member, and crowdsourced laypeople rated these audio responses on overall communication using a 5-point scale. We used analysis of variance (ANOVA) to compare the stringency of the two rating groups and computed the correlation between the groups' ratings to determine whether rank order was preserved across groups. We then used generalizability theory to compare assessment reliability between the two rating groups.

Results:

Twenty internal medicine residents completed the VCA. All 8 patient advocates and 42 of 59 crowdsourced laypeople provided high-quality ratings. Patient advocates rated communication more stringently than crowdsourced laypeople (mean 3.19, SD 0.55 vs mean 3.55, SD 0.40; P<.001), but the correlation between the two groups' ratings of physicians was high (r(19)=.82, P<.001). Reliability (G coefficient) with 8 raters and 6 vignettes was 0.82 for patient advocates versus 0.65 for crowdsourced laypeople. Decision studies estimated that 12 crowdsourced layperson raters and 9 vignettes would yield a G coefficient of 0.75.
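The decision study reported above rests on the standard generalizability-theory formula for the relative G coefficient in a fully crossed person x rater x vignette design: the person variance divided by the person variance plus the interaction variances, each scaled down by the number of raters and vignettes. A minimal sketch of that extrapolation follows; the variance components here are illustrative placeholders, not the study's actual estimates.

```python
def g_coefficient(var_p, var_pr, var_pv, var_res, n_raters, n_vignettes):
    """Relative G coefficient for a fully crossed person x rater x vignette design.

    var_p:   variance due to persons (residents), the "true score" variance
    var_pr:  person x rater interaction variance
    var_pv:  person x vignette interaction variance
    var_res: residual (person x rater x vignette, plus error) variance
    """
    # Relative error variance shrinks as raters and vignettes are added.
    rel_error = (var_pr / n_raters
                 + var_pv / n_vignettes
                 + var_res / (n_raters * n_vignettes))
    return var_p / (var_p + rel_error)

# Illustrative variance components (assumptions, NOT the study's estimates):
var_p, var_pr, var_pv, var_res = 0.30, 0.35, 0.10, 0.55

# Decision study: sweep candidate rater and vignette counts.
for n_r, n_v in [(8, 6), (12, 9)]:
    g = g_coefficient(var_p, var_pr, var_pv, var_res, n_r, n_v)
    print(f"{n_r} raters, {n_v} vignettes: G = {g:.2f}")
```

Adding raters or vignettes only shrinks the error terms they index, which is why the decision study can trade rater count against vignette count to reach a target G coefficient such as 0.75.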

Conclusions:

Crowdsourced laypeople offer a sustainable and reliable solution to rating VCA cases for error disclosure skills, correctly identifying high and low performers. However, at least 12 raters and 9 vignettes are required to ensure adequate reliability. Crowdsourced laypeople rate less stringently than raters who have experienced harm. Subsequent research should examine the role of VCA in multiple possible aspects of error disclosure instruction, including formative assessment, summative assessment, and just-in-time coaching. Clinical Trial: N/A




© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC-BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.