
Parents can now block their children from communicating with specific friends or playing certain games on Roblox, a popular online gaming platform for children.
The changes are part of a series of safety updates designed to give parents more control over their child’s experience on the platform.
From Wednesday, parents and caregivers who verify their identity with an ID or credit card will have access to three new tools. The friend management tool allows them to block any account on their child’s friends list, preventing direct messages between the two accounts, and to report accounts suspected of violating Roblox policies.
Parents can also review and adjust the content maturity level for their child’s account, which determines which games their child can access, and they can obtain detailed insights into their child’s screen time.
Under the Online Safety Act, which took effect this year, tech companies are required to address harmful content on their platforms or face fines of up to £18 million or 10% of their global revenue, whichever is greater.
There have been reports of bullying and grooming on Roblox, and concerns that children are being exposed to explicit or harmful content on the site, which is the most popular platform in the UK among gamers aged eight to 12.
Roblox’s chief safety officer, Matt Kaufman, stated that safety is a core priority for the company and that their mission is to be “the safest and most civil online platform in the world.”
The US-based company is one of the world’s largest gaming platforms, with more monthly users than Nintendo Switch and Sony PlayStation combined. In 2024, the site averaged more than 80 million players a day, and approximately 40% of those users were under 13.
Roblox introduced 40 safety updates last year, including preventing users under 13 from sending direct messages. The company has also updated its voice safety technology, which uses a machine-learning model to moderate chat between players more accurately than human moderators.
Andy Burrows, the chief executive of the Molly Rose Foundation, welcomed the safety improvements but noted that “Roblox still needs to address significant problems with harmful and age-inappropriate content.”
He added, “Extensive research has shown that Roblox is filled with age-inappropriate games and communities, including depression rooms that can exacerbate distress and offer no support to vulnerable children.
“This content raises fundamental questions about Roblox’s broader commitment to safety and shows that the company cannot rely solely on parental controls but must take decisive action to make the platform safe for its young users.”
Last month, Roblox’s co-founder and chief executive, David Baszucki, emphasized the platform’s vigilance in protecting its users and noted that “tens of millions” of people have had positive experiences on the site.
He also stated, “My first message would be: if you’re not comfortable, don’t let your kids be on Roblox. That might sound counterintuitive, but I would always trust parents to make their own decisions.”