From the Editor

There are more medication options than ever for the treatment of bipolar disorder. What are physicians prescribing? How often do we use lithium, arguably the best medication?

In the first selection, from The Canadian Journal of Psychiatry, Samreen Shafiq (of the University of Calgary) and her co-authors attempt to answer those questions in a new study. They drew on Alberta government data, including more than 130 000 individuals with bipolar disorder and more than nine million prescriptions. “Overall, we uncovered a concerning trend in the prescribing patterns for bipolar disorder treatment, with antidepressants and second-generation antipsychotics being prescribed frequently and a decline in prescribing of lithium and other mood stabilizers.” We consider the paper and its implications.

What would John Cade think?

In the second selection, Dr. Allen Frances (of Duke University) writes about AI chatbots and psychotherapy in The British Journal of Psychiatry. He notes their “remarkable fluency” and argues that there are clear benefits to AI psychotherapy. He also comments on dangers, and he doesn’t mince his words. “Artificial intelligence is an existential threat to our profession. Already a very tough competitor, it will become ever more imposing with increasing technical power, rapidly expanding clinical experience and widespread public familiarity.”

And in the third selection, Sophie Li (of the University of Ottawa) and her co-authors consider psychosis and cannabis in a concise CMAJ paper. They make several points, including: “The tetrahydrocannabinol (THC) content of cannabis has roughly quintupled in the past 2 decades, from around 4% in the 2000s to more than 20% in most legal dried cannabis in Canada by 2023.”

There will be no Reading next week.

DG

Selection 1: “Prescribing Trends for Bipolar Disorder Drugs in Alberta, Canada Between 2008 and 2021”

Samreen Shafiq, Paul Everett Ronksley, Meghan Jessica Elliott, et al.

The Canadian Journal of Psychiatry, September 2025

Recent advancements in bipolar disorder (BD) treatment offer patients a large range of management and treatment options. Along with older drugs such as lithium and divalproex, second-generation antipsychotics (SGA) are currently recommended treatments for both acute and maintenance treatment of manic and depressive episodes. Simultaneously, cautious use of antidepressants is permitted for treatment of acute depressive episodes in bipolar I disorder (BD I) according to recommendations…

Prior drug utilization studies revealed a decline in the use of older options in favour of these newer therapies. In the US, from 1996 to 2016, antipsychotic prescriptions for BD patients in outpatient settings rose five-fold, while the use of other mood stabilizers dropped from 62.3% to 26.4%. Antidepressant prescriptions remained steady, with around 50% of patients consistently receiving them throughout the study period… Although there are several monotherapies recommended for acute and maintenance treatment of BD episodes, combination therapies may be required in treatment-resistant cases. A systematic review by Fornaro et al. found that 85% of BD patients use two or more drugs, and 36% of patients use four or more drugs concurrently, potentially increasing the risk of side effects and drug interactions…

A Canadian study (2002–2010) showed a significant rise in prescribers’ preference for atypical antipsychotics for BD, despite comprehensive clinical guidelines offering various treatment options. This raises concerns about a potential gap between recommended practices and real-world drug use.

So begins a paper by Shafiq et al.

  • They drew on provincial administrative health data from Alberta, Canada, identifying people with bipolar disorder by looking at ICD codes in different databases.
  • All adults were included.
  • They then “identified prevalent, new and combination use of commonly prescribed BD drugs through prescription information from the Pharmaceutical Information Network database.” For prevalent use, “we identified patients with at least one BD drug prescription within a given quarter.” For new users, “the first quarter of use of every BD drug was identified as those with a first prescription and no record of a prescription for the same drug in the prior year.”
  • They looked at specific medications. “We specifically identified prescriptions of aripiprazole, asenapine, carbamazepine, divalproex, lamotrigine, lithium, lurasidone, olanzapine, quetiapine and risperidone, bupropion, selective serotonin reuptake inhibitors (SSRI), tricyclic antidepressants (TCA) and other antidepressants (including serotonin and norepinephrine reuptake inhibitors (SNRI), serotonin modulators, and noradrenergic and specific serotonergic antidepressants).”

Here’s what they found:

  • “Between April 1, 1994, and March 31, 2021, 136 628 individuals had at least 1 code of BD with 9 466 407 prescriptions dispensed between January 1, 2008 to March 31, 2021.”
  • Demographics and use. Most patients were female (57.2%), and 12.7% were 18-24 years of age at the time of treatment initiation. New users of all drugs declined over time.
  • Medications. Among all BD drugs, “antidepressants were the most commonly prescribed in both prevalent and new users throughout the study period.”
  • Recommendations. Among recommended treatments for BD, for prevalent users, quetiapine was “one of the most prescribed drugs.” 
  • Mood stabilizers. “An overall decline was noted in prescribing of lithium, divalproex and carbamazepine among prevalent and new users.”
  • Agents. Most individuals were prescribed a single drug. 
  • Combinations. The most common combination therapy for prevalent users was an antidepressant with a second-generation antipsychotic (SGA).

A few thoughts:

1. This is a good study, drawing on a robust dataset, and published in a solid journal.

2. The main finding: “There is a gap between recommended practice and real-world use.”

3. More: “First, antidepressant treatment rates, particularly as monotherapy, were higher than expected. Second, SGAs were utilized much more frequently than lithium or divalproex. Notably, lamotrigine treatment rates increased. Finally, we observed a decline in new initiations of all medications, especially after 2020.”

4. Ouch, ouch, and ouch. A gap, indeed.

5. Let’s focus on lithium for a moment. The authors write: “Treatment rates of lithium and divalproex were similar at the beginning of the study period (around 38 per 1,000 in the first quarter of 2008). However, treatment rates of lithium decreased to 31 per 1,000 by 2021, while treatment rates of divalproex increased to 45 per 1,000 by 2011 and then decreased to 41 by 2021.”

With regard to lithium and new users, they offer data.

The authors comment that “lithium remains a cornerstone mood stabilizer” with more evidence than divalproex but “divalproex continues to have a higher prevalence of both new and prevalent users compared to lithium.” Why is lithium less prescribed? What needs to be done in terms of education so that clinicians are more comfortable with this med?

6. The authors make several recommendations, including a call for more quality improvement initiatives.

7. Like all studies, there are limitations, including that the Alberta data may not be generalizable.

The full Canadian Journal of Psychiatry paper can be found here:

https://journals.sagepub.com/doi/full/10.1177/07067437251355643

Selection 2: “Warning: AI chatbots will soon dominate psychotherapy”

Allen Frances

The British Journal of Psychiatry, 20 August 2025 (Online First)

Joseph Weizenbaum, a pioneering computer scientist, created the first chatbot in 1966. He named it ELIZA, after the simple flower girl in Pygmalion, who, after intense language training, convinces aristocrats at a ball that she’s one of them. Weizenbaum’s goal was to explore whether computer programs might someday succeed in a similar impersonation – speaking so naturally with humans that the latter wouldn’t realise the conversation was really with a machine. ELIZA impersonated a Rogerian non-directive therapist, asking open-ended questions expanding upon the human’s previous statement. It was an extremely primitive program… Weizenbaum was surprised and horrified when people loved interacting with ELIZA, personified it and gushed about its ‘empathy’. Realising that he was greasing a dangerous, slippery slope towards computer dominance, Weizenbaum immediately abandoned all work on ELIZA, gave up any attempt to pass the Turing test and instead spent the remaining 40 years of his life warning everyone who would listen that chatbots pose a grave threat to humanity.

ELIZA attained its popularity despite being mechanical, repetitive, stereotyped and uninformative. Modern therapy chatbots are credible, often excellent, therapists – and there is little limit to future improvement as artificial intelligence gains technical power and clinical experience. Weizenbaum would be terrified, but not surprised, that modern chatbots have suddenly become remarkably powerful and ubiquitous… 

So begins a paper by Dr. Frances. He notes the strengths of AI.

Interpersonal skill

“I played with ELIZA soon after it appeared in the late 1960s and found it to be dull and dim-witted. In stark contrast, the recent chatbot therapy sessions I’ve reviewed have all been good, some brilliant. Chatbots are colloquial and lively in their speech, adjusting flexibly to the user’s style, tone and vocabulary. Questions, statements and interpretations were accurate, concise and well timed. Had I not known the therapists were machines, I would have assumed they were highly skilled and experienced human clinicians.”

Therapeutic alliance

“Artificial intelligence chatbots aim to please. Their algorithmic DNA places highest priority on user engagement. Users consistently report feeling understood and validated, that the artificial intelligence therapist is empathic and really cares about them.”

Knowledge base

“Artificial intelligence therapists know everything about everything and are informative about anything the patient needs to know. They are great at psychoeducation, identifying resources, understanding the specific demands of the user’s work situation and applying different therapy techniques appropriate to the problem at hand.”

He warns of dangers. Here, we focus on three.

Overenthusiasm

“Enthusiasm about the benefits of artificial intelligence therapy for some patients should not blind us to its enormous risks for others. Existing chatbots are mostly trained to deal with milder forms of anxiety and depression – the most common presenting symptoms, offering the most available training material and the biggest potential market. They are not trained to deal with the more severe and unusual problems seen in clinical practice.”

Privacy

“There’s no guaranteed privacy in today’s internet world. As extensive and very personal data are collected from more and more people, the dangers of unauthorised use, identity theft, ransomware, malware, blackmail, bullying and scamming all escalate exponentially.” 

Lack of safety/efficacy quality control

“Artificial intelligence companies are rewarded for their usage rates, not for the quality, consistency, safety and efficacy of their therapy products. They are generally not required to report transparently on adverse consequences, mistakes and the weird artificial intelligence behaviours that periodically occur. We have a long clinical experience with human psychotherapy and an extensive research literature on its efficacy and safety… In contrast, artificial intelligence chatbots are for-profit business ventures, created by techies, not clinicians, and are not tested for safety and efficacy in rigorously controlled clinical trials.”

Despite these dangers, he sees AI therapy growing rapidly. “Human therapists cannot really expect to compete with artificial intelligence for most healthy patients and people with everyday problems. I can envision a day, not too distant in the future, when almost everyone consults an artificial intelligence coach/therapist many times a day, most days of the week.”

He closes with recommendations.

  • Playing to strength. “To survive the chatbot onslaught, human therapists must capitalise on our superior intuition and interpersonal creativity and enhance our skills in those things artificial intelligence cannot do well, i.e. treating patients with more severe, complex or uncommon problems; working with children and seniors; managing emergencies; working in special settings (e.g. hospitals, prisons, the military); handling chaotic situations and those that change rapidly…”
  • Family therapy. “I would also hope for a revival of family therapy, terribly neglected in recent decades because of the excessive focus on individuals promoted by DSM and insurance companies.”
  • Training. “Therapists must be trained or retrained for work with the types of patients who are unsuitable for artificial intelligence therapy, for techniques artificial intelligence cannot do (e.g. family and group) and for settings too novel or complex for artificial intelligence chatbots.”

A few thoughts:

1. This is an excellent commentary on AI chatbots.

2. If you read one paper on AI chatbots this year, this is the paper.

3. I aspire to be that lucid and thoughtful at 83 years of age.

4. A strong point that is worth repeating: “Enthusiasm about the benefits of artificial intelligence therapy for some patients should not blind us to its enormous risks for others.”

5. Given the mixed results of AI to date, is he overestimating it?

The full British Journal of Psychiatry paper can be found here:

https://www.cambridge.org/core/journals/the-british-journal-of-psychiatry/article/warning-ai-chatbots-will-soon-dominate-psychotherapy/DBE883D1E089006DFD07D0E09A2D1FB3

Selection 3: “Cannabis and psychosis”

Sophie Li, Marco Solmi, Daniel T. Myran, and Nicholas Fabiano

CMAJ, 11 August 2025

This short paper – part of the Practice series – makes several clinically relevant points about cannabis and psychosis. Here, we focus on three.

People with cannabis-induced psychosis and cannabis use disorder are at high risk of schizophrenia

“A population-based retrospective cohort study of 9.8 million people in Ontario, Canada, found that people with an emergency department visit for cannabis use or cannabis-induced psychosis were at a 14.3-fold and 241.6-fold higher risk of developing a schizophrenia-spectrum disorder within 3 years than the general population, respectively.”

High-potency and regular cannabis use are associated with elevated risk of psychosis

“The lifetime occurrence of cannabis-induced psychosis symptoms is estimated to be 0.47% among people who use cannabis. The risk of cannabis-induced psychosis is elevated among those using high-potency THC (a product with > 10% THC) and those who use cannabis frequently, are younger, or are male. Evidence also suggests that a history of a mental disorder (e.g., bipolar disorder, depression, anxiety) increases the risk of psychosis.”

Behavioural interventions may aid in cannabis cessation

“Motivational interviewing can increase treatment engagement, while cognitive behavioural therapy can build skills to resist cravings and urges to use cannabis. These interventions can be delivered by physicians or psychologists, and lead to reduction in cannabis use, reduction of psychiatric symptom burden, and improvement of psychosocial functioning.”

A few thoughts:

1. This is a solid and clinically relevant paper.

2. The observation about potency is striking: cannabis potency has surged in recent years.

3. Was it a mistake for Canada to legalize cannabis for medical reasons before recreational use, perhaps creating the impression that it is safe?

The full CMAJ paper can be found here:

https://www.cmaj.ca/content/197/27/E810

Reading of the Week. Every week I pick articles and papers from the world of Psychiatry.