The Netherlands Authority for Consumers and Markets (ACM) has opened a formal investigation into US-based gaming platform Roblox, assessing whether it adequately protects minors under the European Union’s Digital Services Act, with child safety and online risk controls at the centre of the probe.
According to the regulator, underage children in the EU are at risk because of the way Roblox has been designed and is operated, and the platform may lack the “appropriate and adequate measures” to protect children that the Digital Services Act requires. Roblox has a huge following, particularly among younger people, drawing tens of millions of players every day from around the world.
The ACM noted that it has received numerous reports raising concerns about Roblox and that, over the past few months, it has requested information from the platform as part of a preliminary inquiry. Having assessed that material, the regulator concluded there is sufficient reason to open a formal investigation into possible DSA violations.
“ACM, too, has received reports about these issues, and, over the past few months, has requested information from the platform as part of a preliminary investigation,” the regulator said.
“Having assessed this information, ACM sees sufficient reason to launch an official investigation into Roblox for possible violation of the rules,” the regulator added.
The investigation comes as the EU’s Digital Services Act sets a high bar for child protection. Among other things, the DSA requires platforms to take appropriate and proportionate measures to ensure a high level of privacy, safety, and security for minors. The ACM is responsible for enforcing the DSA in the Netherlands.
The regulator said the investigation is expected to take about 12 months and stressed that the potential harm at stake goes well beyond financial damage. The ACM added that it cannot comment further while the investigation is ongoing.
Roblox says it is cooperating with the regulator and maintains that it complies with EU law. A representative said the company is “committed to compliance with the EU Digital Services Act.”
Early last year, as reported by Cryptopolitan, ByteDance-owned TikTok introduced new technology for age checks in Europe on the back of rising concerns from regulators about children using social media platforms.
The video-sharing app said at the time that it would soon roll out the technology to identify users under the age of 13 more accurately and remove them where necessary.
Roblox, for its part, announced in November that it had taken steps toward age verification via facial recognition to help prevent communication between children and adults.
“We look forward to providing the ACM with additional information about the many policies and safeguards that we have in place to protect children,” a Roblox spokesperson said.
The company will need to strengthen its data protection and age verification, much as Grok did in the Philippines. Grok assured Philippine authorities that safety measures would be improved, leading the country to restore access to the AI chatbot, though regulators signaled that tougher oversight would continue.
Authorities in the Philippines said the decision came after the developer committed to removing the image-manipulation features that had triggered concern and prompted a temporary block.
Roblox has also faced several lawsuits in the US and drawn significant criticism worldwide over its failure to protect children on its platform.