Seven lawsuits filed against OpenAI by families of Canada mass-shooting victims
Seven families of victims killed or injured in a mass shooting in Canada have filed lawsuits against OpenAI and its CEO Sam Altman in a California court, accusing him and the company of ignoring the shooter's troubling interactions with ChatGPT.
Eight people were killed, including six children, when 18-year-old Jessie Van Rootselaar opened fire at a secondary school in Tumbler Ridge, British Columbia, in February.
Media reports have since revealed that Van Rootselaar's ChatGPT activity was flagged by OpenAI's safety team months before the attack for references to gun violence, but the company did not alert local police.
Last week, Altman apologised to families of the victims.
"I am deeply sorry that we did not alert law enforcement," Altman wrote in an open letter published by local news outlet Tumbler RidgeLines.
"While I know words can never be enough, I believe an apology is necessary to recognize the harm and irreversible loss your community has suffered."
In a statement responding to the lawsuits, an OpenAI spokesperson said the company has "a zero-tolerance policy for using our tools to assist in committing violence".
The spokesperson added that OpenAI had "already strengthened our safeguards", including better assessment and escalation of "potential threats of violence".
The company also published a blog on Tuesday outlining how OpenAI responds to users who display potentially dangerous behaviour on ChatGPT.
The new legal actions were filed in a California court on Wednesday by a joint legal team from the US and Canada.
They will replace a previous lawsuit filed in a Canadian court by the family of one surviving victim, 12-year-old Maya Gebala, which is being voluntarily withdrawn.
Gebala remains in hospital after being shot three times, in the head, neck and cheek.
Jay Edelson, the lawyer representing the families and community members in the new lawsuits, said he expects to file more than two dozen legal actions related to the shooting against OpenAI.
He added he will be requesting trials by jury in each case.
"We feel very comfortable making a case in front of a jury," he told the BBC.
For Gebala's case, lawyers will be seeking over $1bn (£740m) in damages, Edelson's firm told the BBC, with Edelson saying he expects the jury "to award historic amounts".
The lawsuits accuse OpenAI and its senior leadership, including Altman, of negligence and aiding and abetting the Tumbler Ridge mass shooting by failing to alert law enforcement of the suspect's ChatGPT activities prior to the attack.
One lawsuit naming Gebala and her family alleges that OpenAI "had actual knowledge" of the shooter's intention to carry out an attack through conversations with ChatGPT, in which the shooter described "scenarios involving gun violence".
The conversations were flagged by a 12-person safety team at OpenAI, who recommended that the suspect be reported to the Royal Canadian Mounted Police (RCMP), Edelson said.
Executive leadership at OpenAI, however, vetoed that decision, the lawsuit alleges.
It further alleges that OpenAI's senior leadership made the call not to alert police in order to protect the valuation and reputation of the $850bn (£630bn) company.
"They did the math and decided that the safety of the children of Tumbler Ridge was an acceptable risk," the lawsuit states.
It also alleges that OpenAI lied about the suspect being banned from the platform after the troubling activity was flagged, arguing that the company makes it easy for users to create new accounts.
The suspect, the lawsuit states, made another account under the same name and "continued using ChatGPT to plan the attack".
In a statement to the BBC, OpenAI disputed this and said it revokes access to its services from banned users, which may include disabling their accounts and taking steps to stop them from opening new ones.
The suspect died in the 10 February attack from a self-inflicted gunshot wound.
Edelson told the BBC that he has requested the suspect's chat logs from OpenAI but was refused access, though he believes they will be obtained through the litigation.
"We're going to put the jury in the room when the decision was made to not tell the Canadian authorities," Edelson said.
"We're going to show them how people were jumping up and down saying we need to protect this town, and we're going to show them how Sam Altman and OpenAI routinely make these decisions to put their own interests first."
OpenAI had previously promised Canadian officials that it would strengthen its safety measures in response to the Tumbler Ridge attack.
Altman wrote in his letter that the company will continue to focus on "working with all levels of government to help ensure something like this never happens again".
OpenAI is also facing a criminal probe in Florida related to the use of ChatGPT by a man accused of carrying out a shooting at Florida State University last year. Two people were killed and several others were injured in that attack.