Currently submitted to: Journal of Medical Internet Research
Date Submitted: Mar 31, 2026
Open Peer Review Period: Apr 1, 2026 - May 27, 2026
(currently open for review)
Warning: This is an author submission that is not peer-reviewed or edited. Preprints - unless they show as "accepted" - should not be relied on to guide clinical practice or health-related behavior and should not be reported in news media as established information.
Mapping Practice-Based Signals of Generative AI in Psychiatric Care: A Qualitative Study of Korean Psychiatrists’ Experiences, Interpretations, and Implementation Priorities
ABSTRACT
Background:
Generative artificial intelligence (GenAI) has increasingly entered psychiatric practice through patient-facing chatbots, self-help tools, and clinician-facing workflow support. Although prior research has examined clinicians’ attitudes, readiness, and anticipated use cases, less is known about how frontline encounters with GenAI shape psychiatrists’ interpretations and implementation priorities. Healthcare foresight also remains methodologically underdeveloped and has focused mainly on external signals, overlooking clinically consequential signals emerging from everyday practice. This gap is especially important in psychiatry, where GenAI-related benefits and harms may depend on patient vulnerability, crisis sensitivity, and the therapeutic relationship.
Objective:
To qualitatively examine how South Korean psychiatrists described clinical experiences with GenAI, how they interpreted its roles and limits in psychiatric care, and what implementation priorities they emphasized. Selected concepts from horizon-scanning informed the organization of the analysis by orienting attention to practice-based signals, interpretive patterns, and implementation priorities.
Methods:
In this qualitative descriptive study, directed content analysis and codebook-based thematic synthesis were used to analyze responses to 3 open-ended survey questions administered to members of the Korean Neuropsychiatric Association. Invitations were distributed through the association’s official email system from October 27 to December 26, 2025. The qualitative analysis included respondents who provided an interpretable response to at least 1 item. The questions addressed (1) GenAI-related clinical experiences, (2) perceived advantages and limitations of chatbot-based AI relative to human therapists, and (3) priorities for the safe introduction of GenAI into mental health care. An exploratory participant-level cross-question thematic alignment analysis was also conducted to examine recurring adjacent-item pairings across the experience-interpretation-priority sequence.
Results:
Of 408 survey respondents, 311 provided a meaningful response to at least 1 open-ended item. Psychiatrists described GenAI as a clinically ambivalent technology whose implications depended on context, intensity of use, and patient vulnerability. Practice-based signals clustered around patient-led use, clinician-led use, GenAI as a relational object, and GenAI-mediated changes in the patient-clinician interface, with high-risk and destabilizing scenarios cutting across these domains. Experiences ranged from self-help, emotional reflection, triage, and workflow support to overreliance, conflict with clinical authority, reinforcement of distorted or delusion-like beliefs, and suicide- or self-harm-related risk. Respondents viewed GenAI as potentially useful as an adjunct but as relationally limited and unacceptable as a replacement for human therapists. Implementation priorities centered on governance, crisis and vulnerability safeguards, technical reliability and clinical validation, and education, supervision, and structural readiness. Cross-question analysis suggested recurrent alignments between frontline signals, a view of GenAI as standardized and tireless but relationally thin, and governance- and validation-oriented implementation priorities.
Conclusions:
In this qualitative descriptive study, GenAI emerged in psychiatric practice as an access tool, a workflow aid, and, at times, a competing interpretive reference point in clinical encounters. The key implementation challenge is therefore not whether psychiatry will encounter GenAI, but how its use should be bounded, supervised, and governed in light of patient vulnerability, psychiatric risk, and the relational demands of care.
Clinical Trial: Clinical Research Information Service (CRIS) KCT0011712; https://cris.nih.go.kr/cris/search/detailSearch.do?seq=32747&search_page=M&search_lang=E&class_yn=
Copyright
© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.