La Era
AI

Dutch Regulator Probes Roblox Over Child Safety Risks Under EU Digital Services Act

The Netherlands Authority for Consumers and Markets (ACM) initiated a formal investigation into Roblox regarding potential exposure of underage users to violent and sexual content. The probe aims to assess Roblox’s adherence to the EU's Digital Services Act (DSA) concerning minor protection and will likely span one year. The regulator cited recurring reports of inappropriate material and predatory behavior as justification for the inquiry.



The Netherlands Authority for Consumers and Markets (ACM) launched an official investigation into the popular online gaming platform Roblox on Friday to determine if adequate protections exist for children against violent and sexual imagery. This formal probe will examine potential risks faced by underage users within the European Union ecosystem and is projected to continue for approximately one year.

The ACM stated that the platform frequently generates news coverage due to concerns surrounding explicit content accessible to minors, alongside reports of malicious adults targeting younger users. Furthermore, the regulator expressed concern over misleading commercial techniques allegedly employed by Roblox to drive in-app purchases among children.

This regulatory action stems from the European Union’s Digital Services Act (DSA), which mandates that large online platforms implement proportionate and appropriate safeguards to ensure a high standard of safety and privacy for minors. The ACM views the accumulated allegations as sufficient grounds to formally assess potential breaches of these stringent digital rules.

Should the ACM conclude that Roblox has violated the DSA provisions, the regulator possesses the authority to impose significant enforcement measures. These potential consequences include issuing binding instructions, substantial fines, or other stipulated penalties against the gaming giant.

This scrutiny follows a precedent set in 2024 when the ACM fined Epic Games, the maker of Fortnite, 1.1 million euros ($1.2 million). That penalty was imposed after the ACM determined that the game exploited vulnerable children by pressuring them into making in-game purchases via the Item Shop.

A spokesperson for Roblox confirmed the company's commitment to upholding the EU Digital Services Act requirements. The company referenced its November announcement regarding the introduction of age verification, potentially utilizing facial recognition, to restrict adult-to-child communications.

The company said it looks forward to providing the ACM with comprehensive details about the policies and safeguards it has implemented to protect minor users on the platform. The outcome of this investigation will set a significant precedent for compliance among major interactive entertainment platforms operating under the DSA framework.

This development signals an intensifying global regulatory focus on platform accountability concerning user safety, particularly within environments heavily populated by minors. The findings will inform how other EU member states interpret and enforce the child protection mandates embedded within the DSA.

