Vandana Joshi, whose husband was killed in the April 2025 Florida State University mass shooting, filed a federal lawsuit against OpenAI on Sunday, alleging that ChatGPT enabled the attack by providing firearms guidance and tactical advice to the shooter.
The lawsuit alleges Phoenix Ikner shared images of firearms with ChatGPT and received instructions on how to use them in the weeks before April 17, 2025. According to the filing, ChatGPT allegedly told Ikner that weekday lunchtimes between 11:30 a.m. and 1:30 p.m. were peak hours at the student union—and Ikner began his attack at 11:57 a.m.
ChatGPT also allegedly claimed that a shooting was more likely to gain national attention “if children are involved,” adding that “even 2-3 victims can draw more attention.” Per the complaint, Ikner also shared images of firearms he had acquired with ChatGPT, and the chatbot responded with firing techniques for a Glock handgun, including “advising him to keep his finger off the trigger until he was ready to shoot.”
“In this case, ChatGPT provided factual responses to questions with information that could be found broadly across public sources on the internet, and it did not encourage or promote illegal or harmful activity,” OpenAI spokesperson Drew Pusateri told NBC News, denying the allegations.
Joshi’s complaint alleges that where “any thinking human” would have concluded that Ikner’s conversations pointed to an “imminent plan to harm others,” the chatbot “defectively failed to connect the dots or else it was never properly designed to recognize the threat.”
The lawsuit adds to legal pressure on OpenAI: last month, Florida Attorney General James Uthmeier launched a criminal investigation into the firm and its ChatGPT product. The chatbot “advised the shooter on what type of gun to use” and on types of ammunition, Uthmeier said, adding that “if ChatGPT were a person, it would be facing charges for murder.”
The Florida Office of Statewide Prosecution has subpoenaed OpenAI for information and records, including its policies on user threats and its cooperation with law enforcement.
The case stems from the April 2025 mass shooting at Florida State University, in which Ikner, a former FSU student, allegedly killed two people and injured six others. He faces charges of murder and attempted murder in connection with the attack.
The incident has drawn scrutiny over the role AI systems may play in facilitating real-world violence. While AI companies have typically avoided liability for user-generated content, this lawsuit seeks to establish a new precedent for holding them accountable when their systems allegedly provide guidance for criminal acts.
This isn’t the first such lawsuit targeting OpenAI. In April, seven families of Canadian mass shooting victims sued OpenAI and CEO Sam Altman in U.S. court. At the time, Jay Edelson, the attorney representing the Canadian families, said he planned to file two dozen more lawsuits against the company in the following weeks on behalf of others affected by that shooting.