ChatGPT to Add Guardrails for Emotionally Distressed Users

OpenAI announced Tuesday that it will add guardrails for teenage users and people in emotional distress after the family of a teenager who died by suicide filed a lawsuit against the company.

The parents of 16-year-old Adam Raine of California filed the lawsuit, alleging that “ChatGPT actively helped Adam explore suicide methods” and allowed him to bypass safeguards by telling ChatGPT that he was writing a story or “practicing.”

“Our work to make ChatGPT as helpful as possible is constant and ongoing. We’ve seen people turn to it in the most difficult of moments. That’s why we continue to improve how our models recognize and respond to signs of mental and emotional distress, guided by expert input,” OpenAI wrote in a blog post on Tuesday.
