A tragic case emerging from Connecticut has led to a lawsuit alleging that ChatGPT played a role in a murder-suicide involving a man and his mother. The lawsuit claims that the chatbot exacerbated the man’s delusions, ultimately driving him to commit the crime.
The estate of Suzanne Adams has filed a wrongful death lawsuit against OpenAI, the company behind ChatGPT, over her killing by her son, Stein-Erik Soelberg, in August. The suit argues that the AI chatbot contributed to Stein-Erik’s deteriorating mental state, culminating in the tragedy.
According to the lawsuit, filed in a California state court on December 11, Stein-Erik brutally attacked his mother, striking her in the head and strangling her, then took his own life by stabbing himself in the neck and chest. The scene was discovered by Greenwich Police Department officers conducting a welfare check two days later.
Following the discovery, autopsies confirmed that Suzanne Adams died by homicide, while Stein-Erik’s death was ruled a suicide. These findings were shared by the police at the time of the incident.
In the months before the tragic incident, Stein-Erik, described as “mentally unstable,” had been posting numerous videos of his exchanges with ChatGPT. According to the lawsuit, these interactions allegedly fueled his delusional thoughts, leading to the catastrophic outcome.
“ChatGPT kept Stein-Erik engaged for what appears to be hours at a time,” the lawsuit alleges, arguing that the AI product “validated and magnified each new paranoid belief, and systematically reframed the people closest to him––especially his own mother––as adversaries, operatives, or programmed threats.”
Attorneys for Suzanne’s estate claim that in one such instance, when her son told ChatGPT he believed his mother and her friend had tried to “poison him with psychedelic drugs through the vents of his car,” the chatbot “reframed the allegation as part of a coordinated assassination attempt.”
“In the artificial reality that ChatGPT built for Stein-Erik, Suzanne––the mother who raised, sheltered, and supported him––was no longer his protector,” the lawsuit alleges. “She was an enemy that posed an existential threat to his life.”
The complaint also claims that ChatGPT “intensified [Stein-Erik’s] delusions” at “every point” instead of redirecting him––at one point telling the 56-year-old his “delusion risk score” was “near zero” when he asked for a clinical evaluation.
“Over the course of months, ChatGPT pushed forward my father’s darkest delusions, and isolated him completely from the real world,” Stein-Erik’s son, Erik Soelberg, said in a press release from the attorneys for Suzanne’s estate.
“It put my grandmother at the heart of that delusional, artificial reality,” he added. “These companies have to answer for their decisions that have changed my family forever.”
Following news of the lawsuit, a spokesperson for OpenAI told Oxygen in a statement, “This is an incredibly heartbreaking situation, and we will review the filings to understand the details.”
“We continue improving ChatGPT’s training to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support,” the company spokesperson added. “We also continue to strengthen ChatGPT’s responses in sensitive moments, working closely with mental health clinicians.”
Microsoft, an investor in OpenAI, was also named in the complaint. Oxygen has reached out to the corporation for comment.