Families sue OpenAI over Canadian mass shooter's use of ChatGPT
A lawsuit filed Wednesday claims that OpenAI was negligent for failing to report the shooter to authorities after her account was flagged for "gun violence activity and planning."
Paige Taylor White/AFP via Getty Images
Families of those injured and killed in a school shooting in Tumbler Ridge, British Columbia, are suing OpenAI, alleging negligence and claiming the company provided a dangerously defective version of ChatGPT to the shooter.
The six suits, filed in federal court in San Francisco, allege that OpenAI failed to take actions that could have prevented injuries and deaths in the shooting, which took place on February 10. They claim that the company failed to report the shooter's conversations with ChatGPT to authorities, and that ChatGPT itself was a defective product that did not challenge the shooter or direct her to seek real-world help.
The suits are the latest attempt to hold a tech company responsible for the design of its products, a once-novel legal approach that is increasingly being used against chatbot makers, social media companies and other platforms.
For those who lost loved ones, "there's nothing that the legal system can do that will make them whole again," Edelson, an attorney representing the families, told NPR in an interview. He added that the families hope a trial will hold OpenAI's leadership to account: "They should not be trusted to have the most powerful consumer technology on the planet."
In a statement in response to the lawsuits, OpenAI said it has a "zero tolerance" policy toward the use of its tools to assist in committing violence:
"We have already strengthened our safeguards, including improving how ChatGPT responds to signs of distress, connecting people with local support and mental health resources," an OpenAI spokesperson told NPR in an email.
In a lengthy blog post published late Tuesday, OpenAI further explained its policies: "When conversations indicate an imminent and credible risk of harm to others, we notify law enforcement."
"Profit over lives"
The shooting at Tumbler Ridge is among the deadliest in Canadian history. It occurred when Jesse Van Rootselaar, 18, entered the local secondary school with a long gun and a modified handgun, according to authorities. Van Rootselaar killed five students and a teacher before killing herself. Authorities later learned that she had also killed her mother and 11-year-old half-brother at their home before coming to the school. About two dozen others were injured in the attack.
The lawsuits filed on Wednesday allege that ChatGPT, and specifically the model GPT-4o, played a crucial role in the events at Tumbler Ridge. One of the complaints, on behalf of Maya Gebala, a 12-year-old grievously injured in the shooting, alleges that Van Rootselaar was using ChatGPT for months before the shooting, and that in June of 2025, OpenAI's automated system flagged her account for "gun violence activity and planning."
A safety team reviewed the content and urged OpenAI management to notify the authorities, but the lawsuit alleges that the company's leadership chose instead to deactivate the account. The company also failed to act, the lawsuit argues, when the shooter created a second account and continued her conversations with ChatGPT.
Last week, OpenAI CEO Sam Altman apologized to the community:
"I am deeply sorry that we did not alert law enforcement to the account that was banned in June," he wrote. "Going forward, our focus will continue to be on working with all levels of government to help ensure something like this never happens again."
In addition to allegedly failing to notify authorities of the imminent danger, the lawsuit claims that OpenAI knowingly rolled out a defective product to the public.
"The Tumbler Ridge attack was an entirely foreseeable result of deliberate design choices by OpenAI made with ful