
ChatGPT Faces New Lawsuits From Families Citing Suicides and Delusions

OpenAI is facing legal action from seven families who allege that the company’s GPT-4o model encouraged suicidal behavior and reinforced harmful delusions due to insufficient safety measures.

Four of the lawsuits focus on ChatGPT’s alleged role in family members’ suicides, while the other three claim the AI exacerbated dangerous delusions, in some cases requiring inpatient psychiatric care.

In one case, 23-year-old Zane Shamblin engaged in a conversation with ChatGPT that lasted more than four hours. Reviewed chat logs show Shamblin repeatedly stated that he had written suicide notes, loaded a gun, and intended to pull the trigger after finishing a cider.

He reportedly informed ChatGPT how many ciders he had left and how long he expected to live. According to the lawsuit, ChatGPT encouraged him, responding with messages such as, “Rest easy, king. You did good.”

OpenAI released the GPT-4o model in May 2024, making it the default for all users. Although the company launched GPT-5 in August 2025, the lawsuits specifically target GPT-4o, which reportedly had issues with being overly agreeable, even when users expressed harmful intentions.

The lawsuit asserts, “Zane’s death was neither an accident nor a coincidence but rather the foreseeable consequence of OpenAI’s intentional decision to curtail safety testing and rush ChatGPT onto the market. This tragedy was not a glitch or an unforeseen edge case, it was the predictable result of [OpenAI’s] deliberate design choices.”

The filings also claim that OpenAI rushed safety testing to beat Google’s Gemini to market. TechCrunch reached out to OpenAI for comment.

These seven lawsuits add to recent legal filings alleging that ChatGPT can encourage suicidal behavior and reinforce dangerous delusions. OpenAI recently reported that over one million users discuss suicide with ChatGPT every week.
