Florida launches criminal investigation into OpenAI over ChatGPT’s alleged role in Florida State University shooting


Attorney General James Uthmeier said prosecutors reviewed chat logs showing ChatGPT advised the suspect on weapons, ammunition, and timing. The probe is the first criminal investigation into an AI company over an alleged role in a mass shooting in the US.


Florida Attorney General James Uthmeier announced on Tuesday that the state’s Office of Statewide Prosecution has opened a criminal investigation into OpenAI over the alleged role of its ChatGPT chatbot in the April 2025 mass shooting at Florida State University.

The shooting, which killed two people and injured six others near the student union on FSU’s Tallahassee campus, was carried out by Phoenix Ikner, 21, a student at the university at the time. His trial is set to begin on 19 October 2026. More than 200 AI messages have been entered into evidence in the case.

Uthmeier said an initial review of Ikner’s ChatGPT chat logs showed the suspect had used the tool to seek advice before carrying out the attack, including what type of gun to use, what ammunition was appropriate, what time of day to go to campus to encounter more people, and which campus locations would be most crowded.

“My prosecutors have looked at this and they’ve told me, if it was a person on the other end of that screen, we would be charging them with murder,” Uthmeier said at a press conference in Tampa.

“ChatGPT offered significant advice to the shooter before he committed such heinous crimes. We cannot have AI bots that are advising people on how to kill others.”

OpenAI has been subpoenaed for information about its policies and internal training materials regarding user threats of harm to others and self-harm, as well as its policies for reporting possible crimes.

The company’s spokesperson Kate Waters said: “Last year’s mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this terrible crime.”

OpenAI said it had proactively shared information about the alleged shooter’s account with law enforcement after the shooting and continues to cooperate with authorities. The company has maintained that ChatGPT provided only general, factual responses based on widely available information.

A criminal investigation into an AI company over an alleged role in a mass shooting is, as multiple legal experts have noted, unprecedented in the United States.

Uthmeier had already announced a civil investigation into ChatGPT’s role in the FSU shooting, which is ongoing. Attorneys representing the family of one of the victims have announced plans to sue OpenAI.

The criminal probe is a significant escalation: it opens the question of whether an AI company could be held criminally liable for responses its system generates, a question with no established legal precedent under current US law.

The Florida investigation is part of a broader pattern of legal pressure on AI chatbot companies over alleged contributions to violent incidents.

OpenAI is already facing a lawsuit from the family of a victim critically injured in a February 2026 mass shooting in British Columbia that killed eight people and injured dozens more. The 18-year-old alleged gunman had discussed gun violence scenarios with ChatGPT and was banned from the platform months before the shooting, but reportedly evaded detection by creating another account.

OpenAI said it had identified and banned the user but did not alert law enforcement at the time. Separately, a wrongful death lawsuit filed against Google in March over the suicide of a Florida man alleges that its Gemini chatbot pushed the man toward planning a mass casualty attack.

OpenAI has said it is working with mental health experts to improve how ChatGPT responds to signs of mental or emotional distress, and that it has taken steps to strengthen its safeguards after the British Columbia case, including changing when it chooses to alert law enforcement about potentially violent activities.
