Accepted for/Published in: JMIR Mental Health

Date Submitted: Apr 14, 2024
Open Peer Review Period: Apr 15, 2024 - Jun 10, 2024
Date Accepted: Aug 3, 2024

The final, peer-reviewed published version of this preprint can be found here:

Empowering Mental Health Monitoring Using a Macro-Micro Personalization Framework for Multimodal-Multitask Learning: Descriptive Study

Song M, Yang Z, Triantafyllopoulos A, Zhang Z, Takeuchi H, Nakamura T, Kishi A, Ishizawa T, Yoshiuchi K, Schuller B, Yamamoto Y

JMIR Ment Health 2024;11:e59512

DOI: 10.2196/59512

PMID: 39422993

PMCID: 11530727

Empowering Mental Health Monitoring: Macro-Micro Personalization Framework for Multimodal-Multitask Learning

  • Meishu Song; 
  • Zijiang Yang; 
  • Andreas Triantafyllopoulos; 
  • Zixing Zhang; 
  • Hiroki Takeuchi; 
  • Toru Nakamura; 
  • Akifumi Kishi; 
  • Tetsuro Ishizawa; 
  • Kazuhiro Yoshiuchi; 
  • Bjoern Schuller; 
  • Yoshiharu Yamamoto

ABSTRACT

Background:

The field of mental health technology currently has significant gaps that need addressing, particularly in daily monitoring and personalized assessment. Noninvasive devices such as wristbands and smartphones can already collect a wide range of data, but these data have not yet been fully utilized for mental health monitoring.

Objective:

This paper introduces a novel dataset for personalized daily mental health monitoring and a new macro-micro framework. The framework employs multimodal and multitask learning strategies to improve the personalization and prediction of individuals' emotional states.

Methods:

Data were collected from 242 individuals using wristbands and smartphones, capturing physiological signals, speech data, and self-annotated emotional states. The proposed framework combines macro-level emotion transformer embeddings with micro-level personalization layers specific to each user. It also introduces a dynamic restrained uncertainty weighting method that integrates the various data types into a balanced representation of emotional states. Several fusion techniques, personalization strategies, and multitask learning approaches were explored.
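The abstract does not give implementation details for the dynamic restrained uncertainty weighting method. As a rough illustration only, the sketch below shows the generic homoscedastic-uncertainty multitask weighting it appears to build on (Kendall et al., 2018), where each task loss is scaled by a learnable precision term plus a regularizer; the paper's "restrained" variant presumably adds constraints on top of this. The function name and inputs are hypothetical.

```python
import math

def uncertainty_weighted_loss(task_losses, log_sigmas):
    """Combine per-task losses using learnable log-uncertainties.

    Generic multitask weighting: each loss is scaled by
    1 / (2 * sigma^2), with log(sigma) added as a regularizer so the
    model cannot trivially inflate all uncertainties. This is NOT the
    paper's exact "dynamic restrained" formulation, only its common base.
    """
    total = 0.0
    for loss, log_sigma in zip(task_losses, log_sigmas):
        precision = math.exp(-2.0 * log_sigma)  # 1 / sigma^2
        total += 0.5 * precision * loss + log_sigma
    return total

# With equal (unit) uncertainties, this reduces to half the summed losses:
combined = uncertainty_weighted_loss([0.8, 1.2], [0.0, 0.0])  # -> 1.0
```

In training, the `log_sigmas` would be learnable parameters updated jointly with the model, letting noisier tasks contribute less to the gradient.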

Results:

The proposed framework was evaluated using the concordance correlation coefficient (CCC) and achieved a score of 0.503, demonstrating its efficacy in predicting emotional states.
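The CCC reported above is Lin's concordance correlation coefficient, a standard agreement metric for continuous emotion prediction that penalizes both poor correlation and systematic bias between predictions and annotations. A minimal self-contained sketch (function name chosen here for illustration):

```python
import statistics

def concordance_ccc(y_true, y_pred):
    """Lin's concordance correlation coefficient.

    CCC = 2 * cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2)
    Ranges from -1 to 1; 1 means perfect agreement, and any mean shift
    or scale mismatch lowers the score even if correlation is perfect.
    """
    mx, my = statistics.fmean(y_true), statistics.fmean(y_pred)
    vx = statistics.pvariance(y_true)
    vy = statistics.pvariance(y_pred)
    cov = sum((a - mx) * (b - my)
              for a, b in zip(y_true, y_pred)) / len(y_true)
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)

# Perfect agreement yields 1.0; perfect anti-agreement yields -1.0:
assert abs(concordance_ccc([1, 2, 3], [1, 2, 3]) - 1.0) < 1e-9
assert abs(concordance_ccc([1, 2, 3], [3, 2, 1]) + 1.0) < 1e-9
```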

Conclusions:

The paper concludes that the proposed multimodal and multitask learning framework, which leverages transformer-based techniques and dynamic task weighting strategies, provides superior personalized monitoring of mental health. The study indicates the potential to transform daily mental health monitoring into a more personalized application, opening new avenues for technology-based mental health interventions.


 Citation

Please cite as:

Song M, Yang Z, Triantafyllopoulos A, Zhang Z, Takeuchi H, Nakamura T, Kishi A, Ishizawa T, Yoshiuchi K, Schuller B, Yamamoto Y

Empowering Mental Health Monitoring Using a Macro-Micro Personalization Framework for Multimodal-Multitask Learning: Descriptive Study

JMIR Ment Health 2024;11:e59512

DOI: 10.2196/59512

PMID: 39422993

PMCID: 11530727


© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft other than for review purposes.