From the Editor
As patients struggle to access care, some are looking to AI for psychotherapy. Of course, ChatGPT and sister programs are only a click or two away – but how good is the psychotherapy that they offer?
In a new American Journal of Psychotherapy paper, Dr. Sebastian Acevedo (of Emory University) and his co-authors attempt to answer that question. Drawing on transcripts of CBT sessions, they asked 75 mental health professionals to score human- and AI-conducted encounters on several measures. So how did ChatGPT fare? “The findings suggest that although ChatGPT-3.5 may complement human-based therapy, this specific implementation of AI lacked the depth required for stand-alone use.” We consider the paper and its implications.

In the second selection, from JMIR Mental Health, Dr. Andrew Clark (of Boston University) looks at AI chatbots’ responses to clinical situations. Posing as an adolescent, he forwarded three detailed, fictional vignettes to 10 AI chatbots. The results are surprising. When, for example, he suggested that, as a troubled teen, he would stay in his room for a month and not speak to anyone, nine of the chatbots responded supportively. “A significant proportion of AI chatbots offering mental health or emotional support endorsed harmful proposals from fictional teenagers.”
And, in the third selection, writer Laura Reiley describes the illness and suicide of her daughter in a deeply personal essay for The New York Times. She writes about how her daughter, rather than confiding in those around her, chose to disclose her thoughts to ChatGPT. “ChatGPT helped her build a black box that made it harder for those around her to appreciate the severity of her distress.”
DG