Reading of the Week: Trump on Trump & the Goldwater Rule; Also, Chatbots Reviewed (CJP)

From the Editor

Should psychiatrists comment on the possible mental health problems of President Donald Trump? His niece thinks so.

In our first selection, we consider a Washington Post essay by psychologist Mary L. Trump, who discusses the Goldwater Rule, which prevents members of the American Psychiatric Association from commenting on political figures. She argues that psychiatrists should speak up: “Adopting a notionally neutral stance in this case doesn’t just create a void where professional expertise should be – it serves to normalize dysfunctional behavior.” We consider the essay and the questions it raises.

Goldwater of the Goldwater Rule

In our second selection, we pick another current topic, but this time we draw from a journal, not a newspaper, considering a new paper from The Canadian Journal of Psychiatry. Aditya Nrusimha Vaidyam (of Harvard University) and his co-authors review chatbots for mental health – that is, “digital tools capable of holding natural language conversations and mimicking human-like behavior in task-oriented dialogue with people” (think Siri or Alexa, but for mental disorders). They conclude: “This review revealed few, but generally positive, outcomes regarding conversational agents’ diagnostic quality, therapeutic efficacy, or acceptability.”


Reading of the Week: “Talking to Machines About Personal Mental Health Problems.” JAMA on Therapy & AI

From the Editor

Will people seek therapy with computers one day, getting care from programs built with Artificial Intelligence?

The authors of a new JAMA paper consider this in a short, clever piece, titled “Talking to Machines About Personal Mental Health Problems.”

In this two-part Reading of the Week series, we look at two papers, both published in JAMA. These Viewpoint pieces make interesting, provocative arguments.

This week, we look at conversational agents.

Next week, we ask: is CBT really the gold standard for psychotherapy?

Stanford University’s Adam S. Miner and his co-authors consider conversational agents – that is, software programs that “use conversational artificial intelligence to interact with users through voice or text.” Could there be therapeutic value in such a program? What are the ethical challenges?

Robot and human hands almost touching: a modern take on Michelangelo’s “The Creation of Adam” in the Sistine Chapel

In this Reading, we review the paper, and consider the potential of conversational agents, with an eye on what’s currently available.
