Accepted for/Published in: Journal of Medical Internet Research
Date Submitted: Aug 9, 2023
Date Accepted: Sep 29, 2023
The Potential of Chatbots for Emotional Support and Promoting Mental Well-Being in Different Cultures
ABSTRACT
Background:
Research into AI chatbots has primarily focused on technical advances and validating their effectiveness at decreasing depression symptoms and increasing user engagement. However, real-world chat conversations related to depressive moods remain unexplored. Therefore, further analysis of the uses and effects of chatbots is urgently required.
Objective:
Here we investigate whether and how chatbots facilitate the expression of user emotions, specifically sadness and depression. We also examine cultural differences in the expression of depressive moods between Western and Eastern chatbot users.
Methods:
The study used SimSimi, the world's largest open-domain social chatbot, to analyze 152,783 conversation utterances containing the terms "depress" and "sad" in three Western countries (Canada, the United Kingdom, and the United States) and five Eastern countries (Indonesia, India, Malaysia, the Philippines, and Thailand). The Linguistic Inquiry and Word Count (LIWC) and N-gram techniques were used to compare cultural variations (Study 1). A semi-supervised learning method was also used to determine the most prevalent categories of depressive mood in chatbots (Study 2).
Results:
We find that chatbots can provide helpful information in discussions about depressive moods, especially for users who have difficulty communicating emotions to other humans. Individuals are more likely to express emotional vulnerability related to depressive or sad moods to chatbots (50%) than on social media (8%). Eastern users tend to send more emotionally intense messages, both positive and negative, than Western users when using chatbots (positive: p<.001; negative: p<.05). For example, Eastern users use more words associated with sadness (p<.05), whereas Western users use more words related to vulnerable topics such as mental health (p<.001). Western users also show a greater tendency to break taboos, such as using swear words (p<.001) and discussing death (p<.001). Unlike social media conversations, chatbot conversations tend not to broach topics that require social support from others, such as seeking advice on daily life difficulties. However, chatbot users seem to want conversational agents that exhibit active listening skills and foster a safe space where they can openly share emotional states such as sadness or depression. We observe that users seek help for depressive moods, share emotional messages, and seek information about depression with the chatbot.
Conclusions:
The findings highlight the potential of chatbot-assisted mental health support and emphasize the importance of continued academic efforts to improve chatbot interactions for those in need of emotional assistance. Chatbots are potentially an economical, user-friendly, and patient-centered digital platform for mental health care providers, especially as the pandemic has left lasting mental health issues. Clinical Trial: Not available
Copyright
© The authors. All rights reserved. This is a privileged document currently under peer-review/community review (or an accepted/rejected manuscript). Authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC-BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.