
Accepted for/Published in: Journal of Medical Internet Research

Date Submitted: Aug 8, 2025
Open Peer Review Period: Aug 11, 2025 - Oct 6, 2025
Date Accepted: Jan 30, 2026

The final, peer-reviewed published version of this preprint can be found here:

Insights and Recommendations From Moderators and Community Members for Keeping Online Peer Support Safe: Thematic Analysis

Jones HG, Lavelle G, Aylwin-Foster E, Regan C, Simpson A, Carr E, Hotopf M, Lawrence V

Insights and Recommendations From Moderators and Community Members for Keeping Online Peer Support Safe: Thematic Analysis

J Med Internet Res 2026;28:e81943

DOI: 10.2196/81943

PMID: 41819126

Warning: This is an author submission that has not been peer reviewed or edited. Preprints, unless marked as "accepted", should not be relied on to guide clinical practice or health-related behavior and should not be reported in news media as established information.

Best practice for keeping online peer support safe: Insights and recommendations from moderators and members of a new community.

  • Hannah Grace Jones; 
  • Grace Lavelle; 
  • Elly Aylwin-Foster; 
  • Ciara Regan; 
  • Alan Simpson; 
  • Ewan Carr; 
  • Matthew Hotopf; 
  • Vanessa Lawrence

ABSTRACT

Background:

Online peer support can help people living with long-term physical health conditions to manage their mental wellbeing. Although the potential negative events and risks associated with online forums are well recognised, our understanding of how best to moderate these spaces remains limited, particularly with regard to new communities. Previous work has focused on the experiences of either moderators or community members, but not both.

Objective:

We therefore sought to explore the perspectives of both members and moderators of a new online peer support community to evaluate the moderation procedures and inform recommendations for best practice.

Methods:

Community members (n=39) who participated in a research trial of a new online peer community, CommonGround, were interviewed, and the moderation team (n=5) were invited to a focus group. The community member interviews explored participants' opinions of the moderation policies and the behaviour of the moderation team. The moderator focus group explored their experiences of moderating the community, including perceived benefits, common challenges, and areas for improvement. All interviews and the focus group were conducted online, audio-recorded, and transcribed verbatim. An inductive thematic analysis was conducted, iteratively sorting the data into overarching themes.

Results:

Effective moderation was considered critical for creating a safe space that members wanted to engage with and for mitigating risks, particularly around the spread of medical misinformation. Both moderators and community members felt that the moderation policies and practices were appropriate and applicable to the community. Moderators found navigating the moderation threshold, where they balanced safety against free speech, challenging when deciding whether to intervene. Being part of a team with mixed clinical expertise helped moderators build confidence in navigating this threshold and offered further benefits, including easy access to support and greater consistency in their moderation practices. It was suggested that for a community to flourish, community members would need to self-moderate. However, moderators and members felt that the strong community culture and high levels of member engagement needed to support self-moderation had not yet evolved. Proposed improvements to moderation included new features to make identifying new content for review more efficient and a review of the anonymity rule.

Conclusions:

Moderation is critical in making online peer communities feel safe and engaging. Moderation practices should be co-produced with the target audience to ensure that they are aligned with the community's unique moderation wants and needs, including clear escalation pathways, transparent communication patterns, and plans to review and update policies or procedures as the community evolves. Platforms should include technological features that promote self-moderation, as communities may shift towards it as they mature. It is also critical to ensure that moderators feel supported so that they are best placed to support the broader community. Clinical Trial: n/a



© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC-BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft other than for review purposes.