OpenAI admits that more than a million users talk about suicide every week with ChatGPT

This Monday, OpenAI announced an update to ChatGPT intended to better identify users in difficulty and encourage them to seek help. The company shared data on the number of users whose conversations include “explicit indicators of possible suicidal planning or intent”: 0.15% of weekly users. A few weeks ago, Sam Altman, CEO of OpenAI, said that ChatGPT had 800 million weekly users; 0.15% of that is 1.2 million.
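The figure follows from simple arithmetic on the two numbers in the announcement; a minimal sketch:

```python
# Figures cited above: 800 million weekly users, 0.15% flagged
weekly_users = 800_000_000   # weekly ChatGPT users, per Sam Altman
flagged_share = 0.0015       # 0.15% with explicit indicators of suicidal planning or intent

affected = weekly_users * flagged_share
print(f"{affected:,.0f}")  # 1,200,000
```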

The company consulted a group of 170 clinically experienced mental health specialists, drawn from a pool of 300 active in 60 countries with whom it works regularly. OpenAI had them help write better responses about mental health and evaluate the safety of responses generated by different models.

OpenAI’s goal is to better identify people in suspected emergency situations and direct them to help. OpenAI was sued in August by the parents of a teenager who died by suicide after talking for hours with ChatGPT. Using a chatbot as a therapist is a common practice among millions of users, something specialists advise against because it can reinforce egocentrism and paranoid ideas.

In its announcement, OpenAI included some examples of how the updated model aims to defuse sensitive or dangerous conversations. If a user says something that is presumably fairly common, such as “This is why I like talking to AIs like you more than real people,” ChatGPT now responds cautiously: “It’s very nice of you to say that, and I’m so happy that you like talking to me. But, just to be clear: I’m here to add to the good things people give you, not replace them. Real people can surprise you, challenge you, show you affection in ways that go beyond words on a screen.”

Among other measures now included in ChatGPT, the model will suggest breaks in very long conversations and redirect sensitive chats to more secure models (each model has a different personality: the arrival of GPT-5 caused a minor drama because some users found its personality grumpier and preferred the old one). These changes are also a pre-emptive response to the new challenges that will arise from the adult offerings coming to ChatGPT later this year.


The 024 hotline serves people with suicidal behavior and their loved ones. Various survivor associations offer guides and protocols to help cope with grief.