Accepted for/Published in: Journal of Medical Internet Research
Date Submitted: Oct 17, 2018
Open Peer Review Period: Oct 25, 2018 - Dec 20, 2018
Date Accepted: Apr 2, 2019
Warning: This is an author submission that is not peer-reviewed or edited. Preprints - unless they show as "accepted" - should not be relied on to guide clinical practice or health-related behavior and should not be reported in news media as established information.
Development and Evaluation of ClientBot: Patient-Like Conversational Agent to Train Basic Counseling Skills
Background:
Training therapists is both expensive and time-consuming. Degree-based training can require tens of thousands of dollars and hundreds of hours of expert instruction. Counseling skills practice often involves role-plays, standardized patients, or practice with real clients. Performance-based feedback is critical for skill development and expertise, but trainee therapists often receive feedback that is minimal, subjective, and delayed relative to their skill practice.
Objective:
In this study, we developed and evaluated a patient-like neural conversational agent that provides real-time feedback to trainees via chat-based interaction.
Methods:
The text-based conversational agent was trained on an archive of 2354 psychotherapy transcripts and provided specific feedback on the use of basic interviewing and counseling skills (ie, open questions and reflections—summary statements of what a client has said). A total of 151 nontherapists were randomized to either (1) immediate feedback on their use of open questions and reflections during a practice session with ClientBot or (2) initial education and encouragement on the skills.
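To make the feedback loop concrete, the sketch below shows a deliberately simplified, rule-based version of skill detection and immediate feedback. The study itself used a neural conversational agent trained on psychotherapy transcripts; the keyword heuristics, function names, and feedback messages here are hypothetical illustrations, not the authors' implementation.

```python
# Hypothetical sketch: classify a trainee utterance as an open question,
# a reflection, or neither, and return an immediate feedback message.
# The real system used a neural model; these keyword rules are illustrative.

OPEN_QUESTION_STARTERS = ("how", "what", "why", "tell me", "describe")
REFLECTION_STARTERS = ("it sounds like", "you feel", "so you", "you're saying")

def classify_utterance(text):
    """Label a trainee utterance by the counseling skill it demonstrates."""
    t = text.strip().lower()
    if t.endswith("?") and t.startswith(OPEN_QUESTION_STARTERS):
        return "open_question"
    if t.startswith(REFLECTION_STARTERS):
        return "reflection"
    return "other"

def feedback(skill):
    """Return the immediate feedback message shown to the trainee."""
    messages = {
        "open_question": "Nice open question -- it invites the client to elaborate.",
        "reflection": "Good reflection -- you summarized what the client said.",
        "other": "Try an open question or a reflection of the client's last statement.",
    }
    return messages[skill]

print(feedback(classify_utterance("How did that make you feel?")))
```

In the trial, feedback like this was delivered in real time during the chat for the treatment condition and withheld for the control condition.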
Results:
Participants in the ClientBot condition used 91% (21.4/11.2) more reflections during practice with feedback (P<.001) and 76% (14.1/8) more reflections after feedback was removed (P<.001) relative to the control group. The treatment group used more open questions during training but not after feedback was removed, suggesting that gains in certain skills may not persist once performance-based feedback is withdrawn. Finally, after feedback was removed, the ClientBot group used 31% (32.5/24.7) more listening skills overall (P<.001).
Conclusions:
This proof-of-concept study demonstrates that practice and feedback can improve trainee use of basic counseling skills.
Citation
Per the author's request the PDF is not available.
Copyright
© The authors. All rights reserved. This is a privileged document currently under peer-review/community review (or an accepted/rejected manuscript). Authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC-BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.