November 26, 2025

Has Meta known for years that its products, Facebook and Instagram, are dangerous for children? For the plaintiffs who have filed suit against the multinational in the United States, the answer is yes. Mark Zuckerberg's group is currently embroiled in a massive legal case, and elements of the file made public on Friday, November 21, are laden with accusations: risky moderation choices, inadequate protection of minors, and internal studies on the dangers of its own platforms that were deliberately ignored.

The new allegations come from some 2,171 American plaintiffs, including various public entities, notably schools, as well as the families of teenagers who died by suicide or engaged in self-harm. Since the end of 2022, all of these cases have been consolidated into a single "multidistrict litigation" proceeding handled in California, titled "Adolescent addiction to social networks causing bodily harm." This approach differs from a class action: each case remains individual, and any compensation awarded in the event of a win will be determined on a case-by-case basis.

