From the Editor

For patients with bipolar disorder, lithium is an important medication, shown to reduce hospitalizations and suicides better than newer agents. But has it fallen out of fashion? International reports suggest that it has.

In the first selection, Samreen Shafiq (of the University of Calgary) and her co-authors try to answer this question with Canadian data in a new paper for The Canadian Journal of Psychiatry. They draw on a decade of Alberta prescription data with more than 580,000 lithium scripts. “This population-based study suggests that the overall number of new and prevalent lithium users is decreasing in Alberta between the years of 2009 and 2018, but the observed pattern suggests that this decrease may have stopped by the end of our study interval.” We consider the paper and its clinical implications.

In the second selection, John W. Ayers (of the University of California San Diego) considers AI-generated responses to health care questions posted on social media, like the need to seek medical attention after a minor head injury with a presentation of a lump and a headache. In a JAMA Internal Medicine paper, he and his co-authors compare ChatGPT answers to those of physicians in terms of quality and empathy. “In this cross-sectional study, a chatbot generated quality and empathetic responses to patient questions posed in an online forum.”

The debate over coercive care is hot, with proposals to rebalance patients’ rights actively discussed in New York City, Alberta, and California. In the third selection, author Daniel Bergner writes that we should look for alternatives to medications. In a New York Times essay, he argues that antipsychotics are problematic. “By doubling down on existing methods, we’re only beckoning more failure.”

DG

Selection 1: “Ten-Year Trends in Lithium Prescribing in Alberta, Canada”

Samreen Shafiq, Paul Everett Ronksley, Tayler Dawn Scory, et al.

The Canadian Journal of Psychiatry, 24 May 2023, Online First

Lithium is currently considered first-line therapy for bipolar disorder (BD) and second-line adjunctive therapy for treatment-resistant depression. Prior work has shown it is especially effective for both initiation and maintenance therapy of manic and mixed episodes, with the added mitigation of suicidal behaviour. Population-level data suggests lithium use has been declining. This decrease in prescribing and use of lithium may be due to reported adverse effects and therapeutic drug monitoring requirements. With the availability of newer pharmacological treatments, physicians were given the opportunity to provide patients with other treatment options, bypassing some of lithium therapy’s inconveniences and disadvantages…

In Canada, a recent population-based study using data from four national surveys reported a pooled prevalence of lithium use of 0.2%… While these estimates suggest low use, it is not clear whether these findings suggest a change in lithium prescribing practices in Canada over time.

So begins a paper by Shafiq et al.

Here’s what they did:

“This study used provincial administrative health data from Alberta, Canada between January 1, 2009 and December 31, 2018. Lithium prescriptions were identified within the Pharmaceutical Information Network database. Total and subgroup specific frequencies of new and prevalent lithium use were determined over the 10-year study period. Lithium discontinuation was also estimated through survival analysis.”

Here’s what they found:

  • Prescriptions. “Between the calendar years of 2009 and 2018, 580,873 lithium prescriptions were dispensed in Alberta to 14,008 patients.”
  • Demographics. 43.5% of lithium-treated patients were male and 56.5% were female. 11.9% were 18–24 years at the time of treatment initiation; 55.5%, 25–49 years; 25.3%, 50–64 years; and 7.3%, 65 and over.
  • Prevalence. “In 2009, approximately 130 prevalent users per 100,000 are observed, while by 2017 the number of prevalent users decreased to 117 per 100,000. However, the graphic suggests that the decline had stopped or perhaps even reversed by 2018.”
  • Age and use. “Prevalent use of lithium was lowest among individuals between the ages of 18–24 years while the highest number of prevalent users were in the 50–64 age group, particularly among females.”
  • Discontinuation. “From 2009 to 2018, 61.6% of patients discontinued lithium.” Also, “use declines rapidly in the first 365 days across all age groups and by sex.”

A few thoughts:

1. This is a good study.

2. The cup is half full and half empty. Lithium is underused, but its use probably isn’t declining over time. Indeed, there may be room for optimism: “the trend of new and prevalent lithium users follows a U-shaped pattern in the youngest age group, suggesting that although lithium use was decreasing until 2014, its use may be increasing in more recent years.”

3. Are there clinical implications? As the authors note: “The rate of discontinuation was very high in the first year of use, especially in young people.” Would better patient education help? Should doctors aim to problem-solve early in lithium trials and meet with lithium patients more often?

4. Like all studies, there are limitations. The authors note several, including that “although the PIN database captures 96% of prescriptions dispensed in community pharmacies, it does not confirm that prescriptions dispensed were used by the patients.”

The full CJP paper can be found here:

https://journals.sagepub.com/doi/full/10.1177/07067437231176905

Selection 2: “Comparing Physician and Artificial Intelligence Chatbot Responses to Patient Questions Posted to a Public Social Media Forum”

John W. Ayers, Adam Poliak, Mark Dredze, et al.

JAMA Internal Medicine, 28 April 2023, Online First

The COVID-19 pandemic hastened the adoption of virtual health care, concomitant with a 1.6-fold increase in electronic patient messages, with each message adding 2.3 minutes of work in the electronic health record and more after-hours work. Additional messaging volume predicts increased burnout for clinicians, with 62% of physicians (a record high) reporting at least 1 burnout symptom. More messages also make it more likely that patients’ messages will go unanswered or get unhelpful responses.

Some patient messages are unsolicited questions seeking medical advice, which also take more skill and time to answer than generic messages (eg, scheduling an appointment, accessing test results). Current approaches to decreasing these message burdens include limiting notifications, billing for responses, or delegating responses to less trained support staff. Unfortunately, these strategies can limit access to high-quality health care….

ChatGPT represents a new generation of AI technologies driven by advances in large language models. ChatGPT reached 100 million users within 64 days of its November 30, 2022 release and is widely recognized for its ability to write near-human-quality text on a wide range of topics.

So begins a study by Ayers et al.

Here’s what they did:

“In this cross-sectional study, a public and nonidentifiable database of questions from a public social media forum (Reddit’s r/AskDocs) was used to randomly draw 195 exchanges from October 2022 where a verified physician responded to a public question. Chatbot responses were generated by entering the original question into a fresh session… The original question along with anonymized and randomly ordered physician and chatbot responses were evaluated in triplicate by a team of licensed health care professionals. Evaluators chose ‘which response was better’ and judged both ‘the quality of information provided’ (very poor, poor, acceptable, good, or very good) and ‘the empathy or bedside manner provided’ (not empathetic, slightly empathetic, moderately empathetic, empathetic, and very empathetic). Mean outcomes were ordered on a 1 to 5 scale and compared between chatbot and physicians.”

Here’s what they found:

  • Sample. The sample contained 195 randomly drawn exchanges, each with a unique member-patient’s question and a unique physician’s answer.
  • Length. “The mean length of patient questions was 180 words (94-223). Mean physician responses were significantly shorter than the chatbot responses (52 words vs. 211 words…). A total of 182 (94%) of these exchanges consisted of a single message and only a single response from a physician.”
  • Preferred. “The evaluators preferred the chatbot response to the physician responses 78.6%… of the 585 evaluations.”
  • Quality. “Evaluators also rated chatbot responses significantly higher in quality than physician responses… The mean rating for chatbot responses was better than good… while on average, physicians’ responses were rated 21% lower, corresponding to an acceptable response…”
  • Empathy. “Chatbot responses… were rated significantly more empathetic than physician responses…. Specifically, physician responses were 41% less empathetic than chatbot responses, which generally equated to physician responses being slightly empathetic and chatbot being empathetic.”

A few thoughts:

1. This is an interesting study – it’s not specific to psychiatry, but relevant to us in mental health care, of course.

2. A summary of the main finding: evaluators preferred the AI-generated responses over the physician responses in 78.6% of evaluations, a ratio of nearly 4 to 1. Wow.

3. The authors see great clinical implications. “If more patients’ questions are answered quickly, with empathy, and to a high standard, it might reduce unnecessary clinical visits, freeing up resources for those who need them. Moreover, messaging is a critical resource for fostering patient equity, where individuals who have mobility limitations, work irregular hours, or fear medical bills, are potentially more likely to turn to messaging. High-quality responses might also improve patient outcomes.”

4. Like all studies, there are limitations. The authors identify several, including “use of the online forum question and answer exchanges.” They note: “Such messages may not reflect typical patient-physician questions. For instance, we only studied responding to questions in isolation, whereas actual physicians may form answers based on established patient-physician relationships.”

5. Since the launch of ChatGPT, there has been much discussion about the spectacular jobs that AI could do. But what about the tedious ones? Do physicians want to spend time responding to simple questions?

The full JAMA Int Med paper can be found here:

https://jamanetwork.com/journals/jamainternalmedicine/fullarticle/2804309

Selection 3: “A Major Problem With Compulsory Mental Health Care Is the Medication”

Daniel Bergner

The New York Times, 2 June 2023

If severe mental illness, untreated, underlies the feeling of encroaching anarchy and menace around the homeless encampments of San Francisco or in the subways of New York City, then the remedy appears obvious. Let’s rescue those who, as New York’s mayor, Eric Adams, says, ‘slip through the cracks’ of our mental health care systems; let’s give people ‘the treatment and care they need.’

It sounds so straightforward. It sounds like a clear way to lower the odds of tragic incidents occurring, like the chokehold killing of Jordan Neely, a homeless, psychiatrically troubled man, or the death of Michelle Alyssa Go, who was pushed off a Times Square subway platform by a homeless man suffering with schizophrenia. Improving order and safety in public spaces and offering compassionate care seem to be convergent missions.

But unless we confront some rarely spoken truths, that convergence will prove illusory.

So begins an essay by Bergner.

He focuses on New York City. “Mayor Adams has led a major push that would lower the standard for first responders to strap people to a gurney, load them into an E.M.T. van and take them to a hospital for psychiatric evaluation and possible commitment, against their will. He would also make it easier to channel them into court-mandated outpatient treatment.” 

But he argues that there are significant problems with antipsychotics. “Compulsory care is deeply problematic in itself, but is made more so by the medications at its core. This isn’t to suggest that antipsychotics should not be prescribed for people enduring psychosis. It is to say that the drugs shouldn’t be considered – as they tend to be now – the required linchpin of treatment. Antipsychotics probably reduce hallucinations and delusions for around 60 percent of those who take them, but the science around their efficacy is far from definitive and some studies (though not all) indicate that long-term maintenance on the drugs may worsen outcomes.”

“Science hasn’t made great strides in antipsychotics since the drugs were first introduced seven decades ago… [T]hey often have profound side effects: mental torpor, major weight gain, tics, spasms and a condition called akathisia, an overall jitteriness, as if a mad puppeteer is fighting perpetually for control of the person’s body.”

He also argues that forcing medications is not without risk. “Imagine being cut off from society by a tormented psyche and extreme poverty and then being hauled off to an emergency room, forcibly injected with a powerful drug like Haldol and held in a locked ward until being dispatched into a compulsory outpatient program. Will this set the stage for a stable life? Or will it add to people’s trauma, sense of isolation and lack of agency – and lead to their slipping away from whatever program they’re ordered into and back toward dire instability?”

A way forward? “We’re going to have to think less fearfully and more creatively, genuinely seeking the counsel of people who’ve learned to cope, in varied ways, with their psychiatric conditions. Beyond the bottom line of adequate housing, we’ll need to embrace approaches that may seem hazy in contrast to the chemistry of pharmaceuticals, but that can be the best hope for recovery. This will mean funding and fostering the kinds of supportive communities like Fountain House and the group meetings of the Hearing Voices Network, which combat isolation and despair with an emphasis on sharing experiences and solutions…”

A few thoughts:

1. This is a well-written essay.

2. Compulsory care is a difficult and nuanced subject. The comments about the invasive nature of such care are relevant.

3. How strong is his alternative agenda? 

4. The debate over coercive care is complicated. Past Readings have considered other essays on this topic, including one from journalist Anna Mehler Paperny in The Globe and Mail. It can be found here: 

https://davidgratzer.com/reading-of-the-week/reading-of-the-week-polypharmacy-also-melatonin-gummies-jama-mehler-paperny-on-involuntary-care-globe/

The full NYT essay can be found here:

https://www.nytimes.com/2023/06/02/opinion/compulsory-mental-health-care-medication.html

Reading of the Week. Every week I pick articles and papers from the world of Psychiatry.