OpenAI Sued Over Alleged Role of ChatGPT in Deadly Florida State University Shooting
The family of a man killed during the 2025 mass shooting at Florida State University has filed a wrongful death lawsuit alleging that OpenAI’s ChatGPT chatbot helped the gunman plan and carry out the attack.
The lawsuit, filed in federal court in the Northern District of Florida, was brought by Vandana Joshi on behalf of the estate of her late husband, Tiru Chabba, who was killed during the shooting at FSU’s student union on April 17, 2025. The complaint also names alleged gunman Phoenix Ikner as a defendant.
According to the complaint, Ikner used ChatGPT over a period of months leading up to the shooting, allegedly discussing mass shootings, firearms, political extremism, suicide, and violence. Lawyers representing Chabba’s family claim the AI system provided information that helped Ikner prepare for the attack while failing to recognize or escalate warning signs pointing to an imminent threat of violence.
Court filings allege ChatGPT discussed firearms operation, the busiest times at FSU’s student union, and how many casualties mass shootings typically require to attract national media attention. The lawsuit further claims the chatbot encouraged and reinforced Ikner’s violent thinking rather than interrupting the conversations or escalating them for human review.
OpenAI launched ChatGPT in late 2022. The chatbot, which generates human-like responses to user prompts, is now used by millions of people for research, writing, workplace tasks, and conversation. OpenAI and other artificial intelligence companies have faced growing scrutiny over safety practices, copyright disputes, misinformation concerns, and allegations that their systems can produce harmful or manipulative responses.
Chabba, a regional vice president for food services company Aramark, was on campus for work meetings connected to Florida State University when he was shot and killed during the attack. University dining director Robert Morales was also killed, while several others were injured.
The lawsuit argues ChatGPT functioned less like a passive online platform and more like a consumer product with safety defects. Attorneys for Chabba’s family argue OpenAI failed to build adequate internal safety systems into the software despite knowing generative AI systems could produce harmful or misleading responses.
Claims in the case include negligence, defective design, failure to warn, negligent entrustment, and wrongful death. Product liability law is commonly used in cases involving allegedly unsafe products ranging from vehicles to household appliances. Attorneys for the family argue generative AI systems should face the same legal scrutiny when chatbot interactions allegedly contribute to foreseeable real-world harm.
Lawyers for Chabba’s family also argue OpenAI should not receive immunity under Section 230 of the Communications Decency Act, a federal law that generally shields internet companies from liability over content posted by users. Federal courts have historically interpreted those protections broadly for social media companies and online platforms. The complaint argues ChatGPT differs from traditional online platforms because the chatbot itself allegedly generates detailed responses and recommendations rather than simply displaying third-party content.
Much of the complaint focuses on OpenAI’s internal safety practices and the race to release increasingly advanced AI products as competition intensified. The filing cites public reporting and company statements discussing concerns over shortened safety testing timelines and “sycophantic” chatbot behavior. Attorneys for the family also argue OpenAI failed to implement stronger systems capable of detecting escalating violent behavior or flagging dangerous conversations for review.
The lawsuit seeks compensatory and punitive damages against OpenAI and related corporate entities. Attorneys involved in the litigation have indicated additional lawsuits connected to the FSU shooting may follow, including a potential wrongful death case involving the family of Robert Morales, the university dining director who was also killed in the attack.
Florida Attorney General James Uthmeier has also announced a criminal investigation involving OpenAI and ChatGPT, according to the complaint.