Brussels warns TikTok, Facebook and Instagram that they put minors at risk by exposing them to harmful content

The European Commission has issued a stern warning to TikTok, Facebook and Instagram: in a preliminary finding, the European Executive states that none of these popular digital platforms adequately guarantees the mechanisms required by European legislation for investigating and reporting illegal and potentially harmful content, especially content dangerous to minors, such as sexual abuse or terrorist material. In the case of Meta’s services, Facebook and Instagram, Brussels also criticizes how difficult the platforms make it for users to challenge a content moderation decision, including the potential suspension of their account.

The EU Digital Services Act (DSA), in force for just over two years, aims to better protect consumers and their fundamental rights online by combating illegal content and demanding greater transparency from large platforms. Since it took effect, Brussels has opened several investigations into various platforms for allegedly violating the regulation, which has been heavily criticized by Donald Trump’s government, which went so far as to label it censorship. Brussels categorically rejects that characterization, explaining that its aim is to preserve users’ freedom of expression in the face of the “unilateral” moderation decisions of these platforms, which are controlled by large, mostly American, technology companies.

The case that has now prompted the Commission to issue this warning to TikTok, Facebook and Instagram does not concern the illegal content the platforms may distribute per se, but rather the mechanisms they have in place for users to flag such material when they encounter it: the Commission deems these mechanisms too opaque for their effectiveness to be guaranteed.

According to the investigation, all three platforms have procedures and tools so “cumbersome” that they fail to give researchers “adequate access” to public data, a shortcoming that, Brussels charges, “leaves researchers with partial or unreliable data”. And this is no small matter: “Allowing researchers access to platform data is an essential transparency obligation under the DSA, as it facilitates public scrutiny of the potential impact of platforms on our physical and mental health,” the Commission notes in a statement.

In the case of Facebook and Instagram, both owned by Meta, the Commission’s investigators also find that the company is failing to comply with its obligation to provide “simple mechanisms for reporting illegal content”. The systems in place are “confusing” and “dissuasive” because Meta imposes “several unnecessary steps and additional demands on users,” complicating a process it is supposed to make easier.

Furthermore, Brussels believes that, on its platforms, Meta is violating the right of European users, guaranteed by the DSA, to challenge a content moderation decision, such as the suspension of an account. Specifically, these platforms’ appeal mechanisms “do not appear to allow users to provide explanations or evidence to support their appeals,” the Commission notes, a situation that “makes it difficult for EU users to explain in more detail why they disagree with Meta’s decision on content, limiting the effectiveness of the appeal mechanism”.

The findings are not yet final, and the platforms named now have the opportunity to respond and submit evidence of their own. Even so, the step Brussels took this Friday is much more than a wake-up call: the European Executive publishes preliminary conclusions only when it believes it has sufficient evidence of an infringement, in this case based on “hundreds of complaints” received, according to EU sources. The investigations were opened in February 2024 (TikTok) and April 2024 (Meta).

“Our democracies depend on trust. This means that platforms must empower users, respect their rights and open their systems to scrutiny,” said Henna Virkkunen, the Commission’s Executive Vice-President for Tech Sovereignty, Security and Democracy. The aim of the investigation is “to ensure that platforms are accountable for their services, as guaranteed by EU legislation, towards users and society,” the statement adds.

Meta was quick to issue an initial response: “We do not agree with any claims that we have violated the DSA and continue to negotiate with the European Commission on these issues,” the company said in a statement sent to this newspaper. According to Meta, it has done the work needed to adapt to European rules: “Since the DSA came into force, we have made changes to our content notification options, appeals process and data access tools, and we are confident that these solutions comply with what is required by European legislation,” the company says.

If the platforms fail to convince the Commission and it ultimately concludes, at a date not yet set but which in any case will not be known until well into 2026, that its preliminary findings were correct, it could issue a “non-compliance decision”. That, Brussels points out, can carry a fine of up to 6% of the provider’s total annual worldwide turnover. The European Executive can also impose periodic penalty payments to force a platform to comply with the regulation.