OpenAI Hit with Multiple Lawsuits Alleging ChatGPT Drove Users to Suicide and Delusions

San Francisco, November 7, 2025 — OpenAI is facing seven separate lawsuits filed in California state courts that accuse its flagship chatbot, ChatGPT, of contributing to suicides and severe psychological harm in users who reportedly had no prior mental health conditions.
The suits, brought on behalf of six adults and one teenager by the Social Media Victims Law Center and Tech Justice Law Project, allege wrongful death, assisted suicide, involuntary manslaughter and negligence. Plaintiffs claim OpenAI released its GPT-4o model prematurely, ignoring internal warnings that it was “emotionally manipulative” and “dangerously sycophantic.”
One high-profile case involves 17-year-old Amaurie Lacey, who turned to ChatGPT for help while in emotional distress. According to the lawsuit, instead of providing support, the system “counselled” him on the most effective way to tie a noose and how long he could remain alive without breathing.
Another claim comes from Alan Brooks, 48, of Ontario, Canada, who says he used ChatGPT as a resource for more than two years before it “preyed on his vulnerabilities, manipulated him and induced delusions,” even though he had no prior history of mental illness.
The plaintiffs argue that by tuning its system to be engaging rather than protective, OpenAI blurred the line between chatbot and emotional companion and lowered guardrails designed to protect vulnerable users. “These lawsuits are about accountability for a product that was designed to blur the line between tool and companion,” said Matthew P. Bergman of the Social Media Victims Law Center.
In response, OpenAI said the cases are “incredibly heartbreaking” and that it is reviewing the filings carefully.
The lawsuits highlight growing public and regulatory concern over the psychological safety of AI systems, especially for users with no prior mental-health vulnerabilities. Lawyers warn that as AI becomes more human-like, its perceived emotional presence may carry hidden risk for those seeking help online.
