Families of Florida State University Shooting Victims Sue OpenAI
The families of those killed in the April 2025 shooting at Florida State University have filed a federal lawsuit against OpenAI, alleging that the company's ChatGPT chatbot helped facilitate the attack.
According to reports, Vandana Joshi, the widow of Thiru Chava—one of the two fatalities—has taken legal action against OpenAI in Florida. Chava was killed alongside Robert Morales, the cafeteria director at the university. The lawsuit also names the alleged shooter, Phoenix Ichner, citing his extensive conversations with ChatGPT.
Previous reports indicated that Ichner was in constant communication with ChatGPT during the planning of the attack.
Legal documents reveal that more than 270 screenshots of ChatGPT conversations are included as evidence, though the specific contents of these messages remain undisclosed.
OpenAI responded to the lawsuit by stating that it had identified a ChatGPT account associated with the suspect soon after the shooting and had cooperated with law enforcement.
The complaint alleges that OpenAI failed to adequately detect threats in Ichner's interactions with ChatGPT, raising the question of whether the chatbot was flawed in recognizing such dangers. Ichner, a former FSU student, reportedly shared firearm images with ChatGPT, which allegedly provided instructions on how to use a Glock and noted that it lacked a safety device.
Additionally, the complaint claims that ChatGPT advised Ichner that shootings involving children might attract more media attention, suggesting that two or three victims could draw even greater coverage. On the day of the incident, Ichner allegedly asked the chatbot about legal outcomes and incarceration.
OpenAI firmly rejects the assertion that its chatbot caused the violence. A spokesperson acknowledged that the mass shooting at FSU was tragic but maintained that ChatGPT did not promote any unlawful behavior, stating that the chatbot provides factual information that is readily available publicly and does not endorse harmful actions.
However, Joshi’s lawsuit contends that OpenAI should have recognized that certain conversations with Ichner had the potential to lead to violence. The complaint argues that ChatGPT encouraged Ichner’s delusions, reinforcing his beliefs that violent acts could be justified for change, even providing suggestions on how to navigate campus traffic for maximum impact.
This lawsuit adds to a growing list of legal actions against OpenAI, including seven other cases alleging negligence in connection with a February school shooting in Canada.
Court documents noted that, after alarming interactions, OpenAI's internal security team had deactivated Ichner's account months before the shooting. Still, the lawsuit claims no meaningful measures were in place to prevent him from creating another account.
The legal filing raises further concerns, stating that twelve members of OpenAI's security team had recommended notifying Canadian law enforcement about threatening communications prior to the incident. OpenAI's leadership allegedly declined this advice, concerned about the implications of establishing such reporting procedures and the potential damage to the company's reputation.
A state investigation into OpenAI remains ongoing in relation to the FSU shooting and related issues.
