Accepted for/Published in: JMIR Mental Health
Date Submitted: May 7, 2025
Date Accepted: Feb 10, 2026
Designing and Evaluating Digital Mental Health Interventions: A Scoping Review
ABSTRACT
Background:
The ongoing adoption of digital interventions offers promising opportunities to meet the growing demand for mental health support. The effectiveness, implementation, and uptake of these interventions depend on how well they are designed and evaluated. However, given the emerging nature of design research in this area, there is still no clear consensus on principles and guidelines for developing digital mental health interventions, and best practices for designing and evaluating these tools remain unclear.
Objective:
Our aim is to investigate and report on the design principles and evaluation approaches employed in digital interventions specific to mental healthcare. Additionally, we seek to outline how these principles and approaches are applied in research.
Methods:
This scoping review was conducted in accordance with the PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews) guidelines. The literature search was performed in two electronic databases, Scopus and Web of Science, across three iterations from January 2024 to January 2025. Two independent reviewers screened and selected papers based on predefined inclusion and exclusion criteria, followed by data extraction from the selected studies. The data were then synthesised by categorising the papers according to the primary research aim of each study. The inclusion criteria covered studies involving populations with mental health challenges or users of digital mental health interventions, any digital tools for mental health care, and principles or strategies related to the design, evaluation, or implementation of digital mental health interventions.
Results:
Our search identified 401 papers, of which 17 met the inclusion criteria for this review. Among these, 11 focused on evaluation studies, while 6 covered both design and evaluation (mixed). An iterative user-centred development process, expert inclusion, usability testing, specification of design elements, and user tracking and feedback were identified as common design principles in digital mental health intervention (DMHI) studies. Evaluation approaches were shaped by the evaluation goal, which in turn influenced the chosen methodologies. We also summarise the recommendations for implementation highlighted in some studies. Based on our findings, we propose eight guidelines emphasising, among other considerations, stakeholder involvement in the development process and the need for clear justifications for design decisions.
Conclusions:
Design principles used in DMHI development include user-centred development, expert inclusion, and usability testing. Evaluation approaches often rely on randomised controlled trials (RCTs) to assess efficacy, while qualitative and mixed-method approaches are commonly adopted to capture user experience and bridge process and outcome measures. We recommend that future research explicitly report design justifications and adopt a multi-perspective approach to the research and design of DMHIs.
Copyright
© The authors. All rights reserved. This is a privileged document currently under peer-review/community review (or an accepted/rejected manuscript). Authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.