Accepted for/Published in: JMIR Formative Research
Date Submitted: Jan 21, 2025
Date Accepted: Sep 3, 2025
Can Chatbots Have Empathetic Conversations? A Qualitative Analysis of Responses from 8 Commercial Conversational Agents to Help-Seeking Queries on Depressive Moods
ABSTRACT
Background:
While recent studies have shown the potential of conversational agents to help alleviate depressive moods, the dynamics of user-chatbot interactions in mental health support remain underexplored.
Objective:
We examine real-world conversations between users and chatbots on depression-related topics to identify patterns in how users seek help and how chatbots provide therapeutic support. We assessed the responses of eight commercial chatbots to user queries about depressive moods and evaluated whether they employed therapeutic communication techniques.
Methods:
Our method has two parts. First, we analyzed 13,700 utterances (6,850 user queries and 6,850 responses) about depressive moods from the commercial chatbot SimSimi, covering five English-speaking countries between 2016 and 2021. We classified user queries into five groups based on Rickwood’s help-seeking model and classified chatbot responses into eight therapeutic communication styles. Next, we evaluated the responses of three voice assistants (Amazon’s Alexa, Google Assistant, and Apple’s Siri) and five chatbots (ChatGPT, Replika, Woebot, Wysa, and SimSimi) to user queries about depressive moods.
Results:
In Study 1, we examined how SimSimi, a social chatbot trained to encourage users to share their emotions and build rapport, responded to user queries. The majority of user queries (75.3%) expressed depressed feelings, and a smaller portion (4.1%) sought strategies to cope with depression. The chatbot's responses were largely therapeutic (75%), demonstrating empathy (29%), active listening (26.9%), and open-ended questions (21.8%). Study 2 compared a wide range of conversational agents, revealing that Replika expressed empathy in over 75% of its responses, similar to SimSimi. In contrast, Alexa (88.2%), Google Assistant (60%), Siri (55.6%), and ChatGPT (95.2%) typically responded to depression-related queries with search results rather than offering specific solutions for depressive feelings. Mental health chatbots like Woebot responded to users predominantly with clarification questions (97.3%). We also report instances where conversational agents failed to meet users' help-seeking needs, instead giving irrelevant responses or ignoring emotional requests.
Conclusions:
Our findings reveal a mixed landscape in the emotional support provided by conversational agents. While some social chatbots delivered empathetic responses that fostered deeper user engagement, most commercial chatbots offered merely informative replies to users' help-seeking inputs. Recognizing that users seek support from chatbots, we recommend equipping next-generation conversational agents with capabilities grounded in therapeutic communication, such as empathetic responses.