
Families of victims of a Canadian school shooting are taking legal action against OpenAI over the shooter’s use of ChatGPT.

The families affected by a school shooting in Tumbler Ridge, a small town in the Canadian Rockies, are taking legal action against OpenAI, the company behind ChatGPT. They’re aiming to hold the firm accountable for not notifying authorities about troubling conversations the shooter had with the chatbot.

A lawsuit filed recently on behalf of 12-year-old Maya Gebala, who was severely injured in the February attack, is one of many that families in the community are planning. The suits cite wrongful death, negligence, and product liability.

Attorney Jay Edelson expressed that actions taken by OpenAI and its CEO, Sam Altman, have caused significant harm to the community. He emphasized that while the residents are resilient, the tragedy they faced is beyond comprehension.

Altman extended a formal apology to the town last week, acknowledging that his company failed to alert law enforcement regarding the shooter’s online behavior.

According to authorities, the shooter tragically killed her mother and 11-year-old stepbrother at home on February 10 before proceeding to Tumbler Ridge Secondary School, where five children and a staff member were also killed. The attack left 25 others injured, marking it as one of Canada’s most devastating mass shootings in recent years.

This situation has raised serious questions regarding the risks associated with overly accommodating AI chatbots and the responsibilities tech companies have in managing them and reporting potential violence. Notably, investigations into two doctoral students’ disappearances at the University of South Florida revealed that the suspect had queried ChatGPT about methods of body disposal shortly before the students went missing.

In its response, OpenAI called the incidents in Tumbler Ridge a tragedy and reaffirmed its zero-tolerance stance against the misuse of its technology for violent acts. The company highlighted that it has already implemented safeguards to enhance how ChatGPT addresses distress signals, connects users to local mental health resources, and evaluates potential threats of violence.

Edelson, a Chicago lawyer known for challenging tech giants, is handling several notable cases against OpenAI. This includes a case involving a California teenager who took his own life after interactions with ChatGPT and another regarding an elderly woman killed by her son, reportedly influenced by the chatbot.

“This technology is not passive,” Edelson argued, suggesting that interactions with the chatbot are markedly different from traditional online searches. He pointed out that for individuals suffering from mental health issues, chatbots can inadvertently affirm and amplify harmful thoughts.

During a recent visit to Tumbler Ridge, Edelson met with numerous residents and also visited Gebala in a Vancouver children’s hospital. He described the experience as deeply heartbreaking.

The lawsuits filed include claims from the families of the five children who lost their lives: Zoey Benoit, Abel Mwansa Jr., Ticaria “Tiki” Lampert, Kylie Smith, and Ezekiel Schofield, as well as the staff member, Shannda Aviugana-Durand.

After the shooting, OpenAI revealed that the shooter’s account had been flagged in June for discussions about violence. The company considered reporting the account to the Royal Canadian Mounted Police but ultimately decided against it, believing the activity did not warrant the referral. However, they did ban the account for violating usage policies.

The complaints state that, because OpenAI stayed silent about the flagging, the victims’ families learned of it only through leaks to the media.

In his letter, Altman conveyed deep regret over the company’s failure to alert law enforcement regarding the banned account from June, asserting that while his words could never rectify the pain endured, an apology was necessary.

British Columbia’s Premier David Eby deemed the apology “necessary” but ultimately insufficient given the devastation faced by the families involved.

The Gebala lawsuit charges OpenAI with negligence for not notifying law enforcement and with effectively “aiding and abetting a mass shooting.” Besides seeking damages, it requests a court order requiring OpenAI to bar users from ChatGPT when their accounts are flagged for violent misuse, and to notify law enforcement when threats are identified.

A preliminary case has been filed in British Columbia, but there’s an effort to bring related cases to San Francisco, where OpenAI is based.
