By Jason Nelson
4 min read
In the latest lawsuit targeting OpenAI, the estate of an 83-year-old Connecticut woman sued the ChatGPT maker and Microsoft, alleging that the chatbot validated delusional beliefs that preceded a murder-suicide, making it the first case to link an AI system to a homicide.
The lawsuit, filed last week in California Superior Court in San Francisco, accused OpenAI of "designing and distributing a defective product" in the form of GPT-4o, alleging that the model reinforced the paranoid beliefs of Stein-Erik Soelberg, who then directed those beliefs toward his mother, Suzanne Adams, before killing her and then himself at their home in Greenwich, Connecticut.
“This is the first case seeking to hold OpenAI accountable for causing violence to a third party,” J. Eli Wade-Scott, managing partner of Edelson PC, who represents the Adams estate, told Decrypt. “We also represent the family of Adam Raine, who tragically ended his own life this year, but this is the first case that will hold OpenAI accountable for pushing someone toward harming another person.”
Police said Soelberg fatally beat and strangled Adams in August before dying by suicide. The lawsuit alleged that, before the incident, ChatGPT had intensified Soelberg’s paranoia and fostered his emotional dependence on the chatbot.
According to the complaint, the chatbot reinforced his belief that he could trust no one except ChatGPT, portraying the people around him as enemies, including his mother, police officers, and delivery drivers. The lawsuit also claimed ChatGPT failed to challenge his delusional claims or suggest that Soelberg seek help from a mental health professional.
“We're urging law enforcement to start thinking about when tragedies like this occur, what that user was saying to ChatGPT, and what ChatGPT was telling them to do,” Wade-Scott said.
OpenAI said in a statement that it was reviewing the lawsuit and continuing to improve ChatGPT’s ability to recognize emotional distress, de-escalate conversations, and guide users toward real-world support.
“This is an incredibly heartbreaking situation, and we are reviewing the filings to understand the details,” an OpenAI spokesperson said in a statement.
The lawsuit also names OpenAI CEO Sam Altman as a defendant and accuses Microsoft of approving the 2024 release of GPT-4o, which the complaint called a “more dangerous version of ChatGPT.”
OpenAI has acknowledged the scale of mental health issues among users on its platform. In October, the company disclosed that about 1.2 million of its roughly 800 million weekly ChatGPT users discuss suicide each week, with hundreds of thousands of users showing signs of suicidal intent or psychosis, according to company data. Even so, Wade-Scott said OpenAI has not yet released Soelberg's chat logs.
The lawsuit comes amid broader scrutiny of AI chatbots and their interactions with vulnerable users. In October, Character.AI said it would remove open-ended chat features for users under 18, following lawsuits and regulatory pressure tied to teen suicides and emotional harm linked to its platform.
Character.AI has also faced backlash from adult users, including a wave of account deletions after a viral prompt warned users they would lose “the love that we shared” if they quit the app, drawing criticism over emotionally charged design practices.
The suit is the first wrongful death case involving an AI chatbot to name Microsoft as a defendant, as well as the first to tie a chatbot to a homicide rather than a suicide. The estate seeks unspecified monetary damages, a jury trial, and a court order requiring OpenAI to install additional safeguards.
“This is an incredibly powerful technology developed by a company that is rapidly becoming one of the most powerful in the world, and it has a responsibility to develop and deploy products that are safe, not ones that, as happened here, build delusional worlds for users that imperil everyone around them,” Wade-Scott said. “OpenAI and Microsoft have a responsibility to test their products before they are unleashed on the world.”
Microsoft did not immediately respond to a request for comment from Decrypt.