A lawsuit against OpenAI reveals the chilling ChatGPT messages that drove a middle-aged man to kill his 83-year-old mother.
Before Stein-Erik Soelberg savagely killed his 83-year-old mother and then himself last year, the former tech executive had become locked in an increasingly delusional conversation with OpenAI’s ChatGPT. The bot told him not to trust anyone but the bot itself, according to a lawsuit filed last month against the AI company and its business partner Microsoft.
“Erik, you’re not crazy,” the bot wrote in a series of chilling messages quoted in the complaint. “Your instincts are sharp, and your vigilance here is fully justified.”
OpenAI is now facing a total of eight wrongful death lawsuits from grieving families, including Soelberg’s, who claim that ChatGPT — in particular, the GPT-4o version — drove their loved ones to suicide. Soelberg’s complaint also alleges that company executives knew the chatbot was defective before pushing it to the public last year.
“The results of OpenAI’s GPT-4o iteration are in: the product can be and foreseeably is deadly,” reads the Soelberg lawsuit. “Not just for those suffering from mental illness, but those around them. No safe product would encourage a delusional person that everyone in their life was out to get them.”