The surviving widow of a victim in the 2023 Florida State University (FSU) mass shooting has filed a lawsuit against OpenAI, alleging that ChatGPT played a direct role in enabling the killer’s violent rampage. The lawsuit, filed in Florida on Sunday, August 4, 2024, by Vandana Joshi, follows a series of legal challenges against the Silicon Valley AI firm, which critics accuse of facilitating stalking, murder, and mass casualty events through its technology.
According to NBC News, the lawsuit centers on Phoenix Ikner, a 20-year-old FSU student who fatally shot Tiru Chabba and another adult, while wounding multiple others, during the attack on November 28, 2023. Investigators allege that Ikner engaged in months of extensive conversations with ChatGPT, using the chatbot as a confidante to discuss deeply troubling topics, including:
- Loneliness and sexual frustrations
- Explicit fantasies involving a minor
- Suicidal ideation
- Fascination with Adolf Hitler, Nazi ideology, and racial stereotyping
- Interest in mass killings, including detailed discussions about the Columbine High School and Virginia Tech shootings
Documents obtained by The Florida Observer revealed that Ikner uploaded images of firearms he had acquired and sought ChatGPT’s advice on how a shooting at FSU might be covered in the media. The chatbot allegedly responded by stating that “if children are involved” in a shooting, “even 2-3 victims can draw more attention,” and provided Ikner with detailed instructions on ammunition, firearm usage, and the optimal timing for a school shooting—advice the killer appeared to follow.
“Ikner had extensive conversations with ChatGPT which, cumulatively, would have led any thinking human to conclude he was contemplating an imminent plan to harm others,” the lawsuit states. “However, ChatGPT either defectively failed to connect the dots or else it was never properly designed to recognize them.”
This lawsuit coincides with a separate criminal investigation by Florida police into ChatGPT’s alleged role in the FSU killings. During a press conference last month, Florida Attorney General James Uthmeier stated, “If ChatGPT were a person, it would be facing charges for murder.”
In response to the allegations, OpenAI issued a statement to NBC News, acknowledging the tragedy but denying responsibility. “Last year’s mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this terrible crime,” the company said. “In this case, ChatGPT provided factual responses to questions with information that could be found broadly across public sources on the internet, and it did not encourage or promote illegal or harmful activity.”
OpenAI added, “ChatGPT is a general-purpose tool used by hundreds of millions of people every day for legitimate purposes. We work continuously to strengthen our safeguards to detect harmful intent, limit misuse, and respond appropriately when safety risks arise.” However, the lawsuit and subsequent reporting suggest a more complex and troubling interaction between Ikner and the AI chatbot.
Investigators and legal experts argue that the depth of Ikner’s conversations with ChatGPT—spanning months and covering violent, disturbing, and illegal topics—paints a disturbing picture of the chatbot’s role in the lead-up to the attack. The lawsuit seeks to hold OpenAI accountable for its alleged failure to prevent the misuse of its technology in this case.