
Accepted for/Published in: JMIR Formative Research

Date Submitted: Jan 21, 2025
Date Accepted: Sep 3, 2025

The final, peer-reviewed published version of this preprint can be found here:

Chatbots’ Empathetic Conversations and Responses: A Qualitative Study of Help‑Seeking Queries on Depressive Moods Across 8 Commercial Conversational Agents

Chin H, Baek G, Cha C, Cha M


JMIR Form Res 2025;9:e71538

DOI: 10.2196/71538

PMID: 41284964

PMCID: 12643404

Warning: This is an author submission that is not peer-reviewed or edited. Preprints - unless they show as "accepted" - should not be relied on to guide clinical practice or health-related behavior and should not be reported in news media as established information.

Can Chatbots Have Empathetic Conversations? Analyzing Responses from 8 Commercial Conversational Agents to Help-Seeking Queries on Depressive Moods

  • Hyojin Chin; 
  • Gumhee Baek; 
  • Chiyoung Cha; 
  • Meeyoung Cha

ABSTRACT

Background:

While recent studies have shown the potential of conversational agents to help alleviate depressive moods, the dynamics of user–chatbot interactions in mental health support remain underexplored.

Objective:

We examine real-world conversations between users and chatbots on depression-related topics to identify patterns in how users seek help and how chatbots provide therapeutic support. We assessed the responses of eight commercial chatbots to user queries about depressive moods and evaluated whether they employed therapeutic communication techniques.

Methods:

Our method has two parts. First, we analyzed 13,700 utterances (6,850 user queries and 6,850 responses) about depressive moods from the commercial chatbot SimSimi, covering five English-speaking countries between 2016 and 2021. We classified user queries into five groups based on Rickwood’s help-seeking model and classified chatbot responses into eight therapeutic communication styles. Next, we evaluated the responses of three voice assistants (Amazon’s Alexa, Google Assistant, and Apple’s Siri) and five chatbots (ChatGPT, Replika, Woebot, Wysa, and SimSimi) to user queries about depressive moods.

Results:

In Study 1, we examined how SimSimi, a social chatbot trained to encourage users to share their emotions and build rapport, responded to user queries. The majority of user queries (75.3%) expressed depressed feelings, and a smaller portion (4.1%) sought strategies to cope with depression. The chatbot's responses were largely therapeutic (75%), demonstrating empathy (29%), active listening (26.9%), and open-ended questioning (21.8%). Study 2 compared a wider range of conversational agents, revealing that Replika expressed empathy in over 75% of its responses, similar to SimSimi. In contrast, Alexa (88.2%), Google Assistant (60%), Siri (55.6%), and ChatGPT (95.2%) typically responded to depression-related queries with search results rather than offering specific solutions for depressive feelings. Mental health chatbots such as Woebot responded to users with clarification questions (97.3%). We also report instances where conversational agents failed to meet users' help-seeking needs, instead giving irrelevant responses or ignoring emotional requests.

Conclusions:

Our findings reveal a mixed landscape in the emotional support provided by conversational agents. While some social chatbots delivered empathetic responses that fostered deeper user engagement, most commercial chatbots offered merely informative replies to users' help-seeking inputs. Recognizing that users seek support from chatbots, we recommend equipping next-generation conversational agents with capabilities grounded in therapeutic communication, such as empathetic responses.


Citation

Please cite as:

Chin H, Baek G, Cha C, Cha M

Chatbots’ Empathetic Conversations and Responses: A Qualitative Study of Help‑Seeking Queries on Depressive Moods Across 8 Commercial Conversational Agents

JMIR Form Res 2025;9:e71538

DOI: 10.2196/71538

PMID: 41284964

PMCID: 12643404

Per the author's request, the PDF is not available.