From the Editor
In March, medical assistance in dying will be expanded in Canada to include those with mental illness. Not surprisingly, many people feel strongly about it, with some seeing the change as a natural extension of basic rights and others arguing that it would be a profound mistake.
What do patients and family members think? How does it relate to their views on suicide in general? Lisa Hawke (of the University of Toronto) and her co-authors attempt to answer these questions in a new Canadian Journal of Psychiatry paper. They conduct a qualitative analysis, interviewing 30 people with mental illness and 25 family members about medical assistance in dying when the sole underlying medical condition is mental illness (or MAiD MI-SUMC). “Participants acknowledge the intersections between MAiD MI-SUMC and suicidality and the benefits of MAiD MI-SUMC as a more dignified way of ending suffering, but also the inherent complexity of considering [such] requests in the context of suicidality.” We consider the paper and its implications.
In the second selection, Dr. Scott Monteith (of Michigan State University) and his co-authors write about artificial intelligence and misinformation in a new British Journal of Psychiatry paper. They note the shift in AI – from predictive models to generative AI – and its implications for patients. “Misinformation created by generative AI about mental illness may include factual errors, nonsense, fabricated sources and dangerous advice.”
And in the third selection, writer Shannon Palus discusses the rise of “mental health merch” – clothing and other merchandise referencing mental health conditions and treatments, including a pricey sweatshirt with “Lexapro” written on the front (the US brand name for escitalopram). In Slate, Palus discusses her coolness toward the trend. “As a person who struggles with her own mental health, as a Lexapro taker – well, I hate this trend, honestly! I find it cloying and infantilizing.”
Note that there will be no Reading next week.