Roblox is played every day by 152 million people around the world. A noteworthy detail: almost 40% of users of this video game platform are under 13 years old and enjoy interactive environments such as Brookhaven or Grow a Garden. But they also share this virtual playground with adults who are not required to verify their identity to participate. Children who have been taught not to talk to strangers on the street play and chat on Roblox with adults from anywhere in the world without even knowing it.
As the platform’s popularity has grown in recent years, so have its problems: adults using it to molest minors, peer bullying, exposure to violent content, and cases of addiction or scams involving its virtual currency, Robux. The company has responded with measures such as facial recognition for users, but continues to face criticism from parents, doctors and prosecutors, who accuse it of not doing enough to protect children.
The platform has been in the spotlight since August, when it deleted the accounts of several “vigilantes” in the United States, volunteers who patrolled games to spot potential harassment and alert authorities to security gaps. Roblox justified the blocks, saying the accounts “began impersonating children and actively sought to connect with adult users,” which violates its terms of service.
Amid the controversy, the company has been sued by the attorneys general of several U.S. states, including Texas. Texas Attorney General Ken Paxton accused Roblox of “profiting from a digital playground that hides predators and manipulative psychological design behind a façade of ‘family fun.'” This is in addition to other charges in Florida, Louisiana and Kentucky, as well as private lawsuits. A Dallas family is suing Roblox for wrongful death following their son’s suicide, alleging that an adult convinced him to disable the game’s parental controls and move conversations to Discord, a messaging platform popular with gamers, where he was later extorted over intimate photos and videos.
On Tuesday, Roblox announced a new age verification feature that uses the camera and facial recognition to estimate a user’s age to access communication features. Once assigned to an age group, users will only be able to chat with participants of a similar age: for example, a 12-year-old can only talk to users who are 15 years old or younger. The company opened a voluntary enrollment period and said the feature will become mandatory in early December in select markets such as Australia, New Zealand and the Netherlands, and worldwide in January.
Police investigations
An increase in inquiries about Roblox prompted a report from the chief inspector of the Child Protection Unit of the Spanish National Police’s Central Cybercrime Division. The report notes that between 2019 and September 2025, the unit conducted 24 investigations into sexual abuse, child prostitution, solicitation of suicide, and scams involving minors on Roblox, resulting in six arrests. Three investigations are still ongoing.
The inspector, who prefers to remain anonymous, clarified that these are not official statistics, but data to evaluate the evolution of the phenomenon. Roblox is not the platform with the most reports, but he declined to name the others to avoid alerting the suspects: “There is an increase in complaints on almost all platforms, which is logical because this modus operandi is facilitated by messaging apps where minors can share audio, video, images and chat with people they don’t know.”
Roblox is just one of these channels. Police receive reports both from private individuals and from NCMEC, the U.S. National Center for Missing &amp; Exploited Children, to which U.S.-based companies are required to report suspicious behavior.
Experts say the biggest risk to children lies in the chat function, which does not allow photo sharing and is monitored by artificial intelligence and a team of human moderators. However, some adults circumvent these safeguards by using coded language to communicate, or by persuading children to continue conversations and exchange images through other, less regulated messaging apps.
Pediatricians are sounding the alarm
The Spanish Society of Adolescent Medicine (SEMA) and the Health Promotion Committee of the Spanish Association of Pediatrics (AEP) issued a warning in September after noticing an increase in children and adolescents presenting “worrying symptoms” linked to the misuse of Roblox.
These symptoms include self-harm, significant behavioral changes, anxiety, insomnia, social withdrawal, and unusual comments or behaviors regarding sexuality, identity, or violence. They also highlight the potential for addiction caused by the platform’s points system.
María Angustias Salmerón, pediatrician and member of the AEP’s Health Promotion Committee, explains: “We are seeing many cases and, considering that this is just the tip of the iceberg, if cases are reaching our consultations, it is because there is a serious problem.” She notes that her colleagues in different parts of Latin America have observed similar diagnoses.
The doctor recommends that parents “be proactive in finding out where their children play online, ask them if they have had experiences that have made them feel uncomfortable, and, if they detect even the slightest problem, take them to the pediatrician for an evaluation.”
Many parents are unaware of Roblox parental controls. According to Salmerón, they must create their own account linked to their child’s to manage chat, block users or experiences, limit purchases and screen time, and track which games their children spend the most time on.
When contacted about the issue, Roblox responded via email: “Every day we see an average of 6 billion chat messages and 1.1 million hours of voice communication in dozens of languages, the vast majority of which are daily conversations, but a small number of malicious actors attempt to circumvent our system.” The company argued that “bad actors adapt to evade detection” and recommended “ensuring that children have accounts with the correct information, such as their date of birth, so that they have additional protections enabled by default and see age-appropriate content.”
Antonio Planells, a TecnoCampus professor at Pompeu Fabra University in Spain, says Roblox “found that a lot of its rapid growth was due to being a social network with a very user-friendly aesthetic, where minors felt very comfortable playing, and it didn’t know how to handle that.” For the professor, the key “is not so much in the access filter, but in the fact that a minor exposed on the platform has no defense or protection mechanisms.” He believes that “we need to invest in making it a safe space through control over content creation and effective and extensive moderation of activities.”
The Spanish Data Protection Agency (AEPD) has opened proceedings against Roblox for “pornographic or violent content” that it allegedly allows children to access “without restrictions” and for processing personal data of 13- and 14-year-olds, who are legally minors. The AEPD is also investigating the fact that “you may register as a new user using a date of birth, a fictitious username, and a password.” The agency told EL PAÍS that it could not provide information on ongoing cases.
British research agency Revealing Reality evaluated the platform before and after the new parental controls and found some improvements. However, it noted that age verification remains insufficient, as adults can easily infiltrate child-only spaces, and chat and voice moderation are not entirely effective.
The agency’s recommendation is clear: “Don’t rely on platform promises or research reports, including ours, to decide whether Roblox is right for your child. Create your account; it only takes a few minutes. Spend time in the spaces used by your child. Listen to the conversations. Then make your own informed decision.”
