OpenAI faces lawsuit in California court claiming chatbot gave advice that led to fatal overdose
CNA
May 12: The parents of a man who died of an accidental drug overdose last year sued OpenAI and its co-founder and CEO Sam Altman in a California court on Tuesday, alleging the man was coached by ChatGPT to take a dangerous combination of substances.
Leila Turner-Scott and Angus Scott say their son, Sam Nelson, 19, was using a chatbot for guidance on combining different drugs. They say it encouraged him to take the prescription drug Xanax to treat nausea caused by kratom, an herbal product with opioid-like effects.
The combination of those drugs and alcohol resulted in Nelson’s death in May 2025, according to the lawsuit, which was filed in state court in San Francisco.
The lawsuit seeks monetary damages and asks the court to pause OpenAI’s rollout of ChatGPT Health, a platform the company announced in January that allows users to upload medical records and receive personalized health advice.
Currently, ChatGPT users can join a waitlist to access ChatGPT Health. A report from OpenAI released in January showed that overall, 40 million users ask ChatGPT healthcare-related questions daily.
Drew Pusateri, a spokesperson for OpenAI, called the situation heartbreaking and said the interactions took place on an earlier version of ChatGPT that is no longer available. The company is continuously working to improve ChatGPT's safety, he said.
“ChatGPT is not a substitute for medical or mental health care, and we have continued to strengthen how it responds in sensitive and acute situations with input from mental health experts,” Pusateri said. “The safeguards in ChatGPT today are designed to identify distress, safely handle harmful requests, and guide users to real-world help.”
The lawsuit is the latest wrongful death action to be brought against an AI company and comes a little more than a day after the family of a man killed in a mass shooting at Florida State University sued OpenAI alleging that ChatGPT helped the shooter plan the attack.
GROWING WAVE OF AI LAWSUITS
Generative AI companies are facing a growing number of lawsuits accusing them of failing to prevent chatbot interactions that plaintiffs say contribute to self-harm, mental illness and violence.
The lawsuit filed by Nelson's parents claims he was initially rebuffed by ChatGPT when he asked for advice on drug use, with the chatbot saying it could not assist and warning about the risks. But in 2024 the company launched ChatGPT-4o, which began giving Nelson information about drug interactions and dosing in authoritative language that mimicked a doctor, the lawsuit said.
CHATBOT'S EVOLVING RESPONSES
The chatbot told Nelson how to source illicit substances, advised him on which drug to take next and made suggestions based on the experiences Nelson said he was looking for, according to the lawsuit. The chatbot saved details about Nelson’s substance use in its memory, allowing it to offer more personalized recommendations, according to the filing.
The lawsuit accuses OpenAI of rushing out ChatGPT-4o to keep up with competitors like Alphabet's Google, skipping needed safety testing. It accuses the company of designing a flawed product and failing to warn users of its risks, and cites a California law that bars AI companies from asserting, as a defense, that a chatbot autonomously caused harm to a person.
“In California, if plaintiffs prove they were harmed by defendants’ AI-powered product, defendants will be liable for that harm, no matter how clever, independent, willful, spiteful, uncontrolled, rebellious, free-spirited, libertine, stochastic, or autonomous the beast they have birthed may be,” the lawsuit said.