Roblox Overhauls Content Moderation After Child Safety Lawsuits
Roblox cracks down on its user-created content following multiple child safety lawsuits
Roblox is implementing major policy changes for its user-created content. The move follows several lawsuits alleging that the popular gaming platform has failed to provide a safe environment for its younger players. The company recently outlined the updates on its website.
One of the most notable shifts concerns “unrated experiences,” Roblox’s term for user-created games that lack a content rating. Previously, users aged 13 and older could access these unrated games; going forward, access will be limited to the developer and active collaborators on the project. The change is set to roll out in the coming months.
To further combat inappropriate content, Roblox is introducing stricter rules for “social hangout” experiences, especially those depicting private spaces like bedrooms or bathrooms, or adult-only environments such as bars and clubs. These types of games will now be restricted to users who are at least 17 years old and have verified their identity.
To enforce these new guidelines, Roblox plans to launch an automatic detection tool that identifies “violative scenes,” meaning user activity that breaches the rules. If a server accumulates too many violations, it will be automatically taken offline, and developers will need to work directly with the Roblox team to make the necessary changes before the experience can return.
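Roblox has not published implementation details for this tool, but the behavior described above (counting rule violations per server and taking a server offline once too many accumulate) can be illustrated with a minimal, purely hypothetical sketch. The class names, method names, and threshold below are assumptions for illustration only, not real Roblox APIs.

```python
# Hypothetical sketch of a threshold-based takedown flow; Roblox has not
# disclosed how its detection tool works, and everything here is assumed.
from dataclasses import dataclass


@dataclass
class ServerModerationState:
    server_id: str
    violation_count: int = 0
    online: bool = True


class ViolationTracker:
    """Counts detected violations per server and takes a server offline
    once it crosses an assumed takedown threshold."""

    def __init__(self, takedown_threshold: int = 3):
        self.takedown_threshold = takedown_threshold
        self.servers: dict[str, ServerModerationState] = {}

    def record_violation(self, server_id: str) -> ServerModerationState:
        state = self.servers.setdefault(server_id, ServerModerationState(server_id))
        state.violation_count += 1
        # Once too many violations accumulate, the server goes offline until
        # the developer resolves the issue with the platform team.
        if state.online and state.violation_count >= self.takedown_threshold:
            state.online = False
        return state


tracker = ViolationTracker(takedown_threshold=3)
for _ in range(3):
    status = tracker.record_violation("server-123")
print(status.online)  # False: the server was automatically taken offline
```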
These policy revisions are a direct response to the lawsuits. For example, after Louisiana’s attorney general sued the company, Roblox issued a separate statement asserting its commitment to blocking exploitative behavior and continually improving its moderation efforts.
The company stated firmly, “Any assertion that Roblox would intentionally put our users at risk of exploitation is simply untrue.” It also acknowledged the challenges, adding, “No system is perfect and bad actors adapt to evade detection, including efforts to take users to other platforms, where safety standards and moderation practices may differ.”