Meta defends itself against accusations
Are companies hiding explosive studies about depression?
November 25, 2025 – 18:10 · Reading time: 2 minutes
Is Meta hiding evidence of risks to young people? The new documents spark debate about how companies handle sensitive information.
New court documents cast Facebook’s parent company Meta in a damning light: the company allegedly stopped an internal study that found evidence of its platform’s negative impact on mental health. Meta rejects these accusations.
According to unredacted US court documents, the 2020 research project “Project Mercury” was never completed. In the study, users who stopped using Facebook for a week reported “reduced feelings of depression, anxiety, loneliness, and social comparison stress.”
Despite these results, the project was abandoned and its findings were never published. A Meta spokesperson explained that the investigation had been stopped because of methodological shortcomings.
The documents are part of a lawsuit brought by several US school districts against Meta, Google, TikTok, and Snapchat. The plaintiffs accuse Meta of prioritizing growth over the safety of young users: protective features were allegedly designed to be ineffective by intention, and the company is said to have obstructed action against child predators. In one documented case, an account was blocked only after 17 recorded attempts at sexual solicitation.
The plaintiffs also allege that CEO Mark Zuckerberg said children’s safety was not his top priority because he was focused on the metaverse. A Meta spokesperson rejected these claims as well, speaking of “quotes taken out of context” and “misinformation,” and insisted that the company’s safety measures are effective.
The European Commission has separately criticized features and algorithms that encourage addictive behavior and can trigger the so-called “rabbit hole” effect: endless scrolling that can take a toll on young users’ mental health. The Commission also deemed Meta’s age verification system inadequate for keeping children away from inappropriate content.
The company’s risk analysis has also drawn criticism: the EU doubts that Meta adequately assesses and mitigates systemic risks such as the risk of addiction and its impact on well-being. The Commission is further investigating whether the recommendation algorithm ensures adequate privacy and respects the fundamental rights of minors. Formal proceedings can result in binding requirements or fines of up to six percent of global annual turnover.
A hearing in the Northern District Court of California on January 26, 2026 is expected to clarify how serious the allegations are and what consequences they may carry.