Canada school shooting: Victims' families sue OpenAI, Sam Altman over alleged chatbot role in Tumbler Ridge's deadliest attack

by · The News International
Families of the victims of Canada's deadliest mass shooting have accused OpenAI over the incident.

Family members of victims of Canada's deadliest mass shooting sued OpenAI and CEO Sam Altman in U.S. court on Wednesday, alleging the company identified the shooter as a credible threat eight months before the attack but did not warn police.

The lawsuits, filed in federal court in San Francisco, accuse OpenAI leaders of not alerting police because doing so would have exposed the volume of violence-related conversations on ChatGPT and potentially jeopardized the company's path to a nearly $1 trillion initial public offering.

The February shooting in Tumbler Ridge, British Columbia, left nine people dead, many of them children.

An OpenAI spokesperson called the shooting “a tragedy" and said the company has a zero-tolerance policy for using its tools to assist in committing violence.

“As we shared with Canadian officials, we have already strengthened our safeguards, including improving how ChatGPT responds to signs of distress, connecting people with local support and mental health resources, strengthening how we assess and escalate potential threats of violence, and improving detection of repeat policy violators,” the spokesperson said in a statement.

The cases are part of a growing wave of lawsuits accusing artificial intelligence companies of failing to prevent chatbot interactions that plaintiffs say contribute to self-harm, mental illness and violence.

They appear to be the first cases in Canada to allege that ChatGPT played a role in facilitating a mass shooting.

Jay Edelson, who is representing the plaintiffs, said he plans to file another two dozen lawsuits against the company in the coming weeks on behalf of other people affected by the shooting.

Lawsuit claims OpenAI safety team was overruled:

Jesse Van Rootselaar, whose interactions with ChatGPT are at the center of the lawsuits, shot her mother and stepbrother at home before killing an educational assistant and five students aged 12 to 13 at her former school on February 10, according to police. Van Rootselaar, who was 18, then died by suicide.

The plaintiffs include relatives of those killed at the school and a 12-year-old girl who survived after being shot three times but remains in intensive care.

According to one of the complaints, OpenAI's automated systems in June 2025 flagged ChatGPT conversations in which the shooter described gun violence scenarios.

Safety team members recommended contacting the police after concluding she posed a credible and imminent threat of harm, said the complaint, which cites a Wall Street Journal article from February about the company's internal discussions.

But Altman and other OpenAI leaders overruled the safety team, and police were never called, the lawsuit alleges. The shooter's account was deactivated, but she was able to open a new account and continue using the platform to plan her attack, the lawsuit claims.

Following the publication of the Wall Street Journal article, the company said the account was flagged by systems that identify "misuses of our models in furtherance of violent activities" but the issues did not meet its internal criteria for reporting to law enforcement.

Last week, a local Tumbler Ridge newspaper published an open letter in which Altman said he was "deeply sorry" the account was not flagged to law enforcement.

In a blog published Tuesday, OpenAI said it trains its models to refuse requests that could "meaningfully enable violence" and notifies law enforcement when conversations suggest "an imminent and credible risk of harm to others," with mental health experts helping assess borderline cases. The company said it continually refines its models and detection methods based on usage and expert input.

The lawsuits seek an unspecified amount of damages and a court order requiring OpenAI to overhaul its safety practices, including mandatory law enforcement referral protocols. One of the victims originally filed her lawsuit in Canadian court but dismissed it to pursue her claims in California, Edelson said.

OpenAI faces multiple lawsuits:

The lawsuits over the Tumbler Ridge shooting follow multiple suits filed against OpenAI in U.S. state and federal courts in recent months over claims that ChatGPT facilitated harmful behavior, suicide, and, in at least one case, a murder-suicide.

The lawsuits, which are still in their early phases, will force courts to grapple with what role an AI platform can play in promoting violence and whether the company can be held liable for its own actions or those of its users.

OpenAI has denied the claims in the lawsuits, arguing in the murder-suicide case that the perpetrator had a long history of mental illness.

Florida Attorney General James Uthmeier announced earlier this month a criminal investigation into ChatGPT's role in a 2025 shooting at Florida State University.