
Research published by digital behavior experts Revealing Reality has highlighted how easily children can encounter inappropriate content and interact unsupervised with adults on the gaming platform Roblox. The study, which involved creating multiple Roblox accounts registered to users of various ages, found significant risks for children despite recent platform updates.
Parents have expressed serious concerns about children becoming addicted to the platform, being exposed to disturbing content, and interacting with strangers. Roblox acknowledges that harmful content and "bad actors" exist on its platform and says it is working to address these issues.
Roblox, a virtual universe featuring millions of user-generated games and interactive environments, had over 85 million daily active users in 2024, an estimated 40% of whom were under the age of 13.
While the company acknowledges that some users come to harm, it also emphasizes the positive experiences of tens of millions of players. The Revealing Reality investigation, shared with several publications, found a disconnect between Roblox's child-friendly image and the reality of children's experiences.
The researchers created accounts registered to users aged five, nine, ten, and thirteen, along with adult accounts, and these accounts interacted only with one another to avoid influencing other users' behavior. Despite the platform's recently introduced parental control tools, the researchers concluded that the existing safety controls are of limited effectiveness, leaving children at risk.
The report found that children as young as five could communicate with adults while playing, and that age verification was not consistently enforced. This was despite changes Roblox implemented in November 2024 to prevent under-13 accounts from sending direct messages outside of games or experiences. The report also documented access to sexualized content, including avatars in suggestive poses, and explicit conversations in voice chat.
Researchers noted instances where adults attempted to solicit personal information from younger users, highlighting the potential for predatory behavior. Roblox maintains that voice chat, available to verified users 13 and older, is subject to real-time AI moderation, and that text chat is moderated by filters. However, the report indicates that these measures can be circumvented.
Roblox acknowledged the presence of "bad actors" on the internet and called for industry-wide collaboration and government intervention to improve safety measures across all platforms. The company also conceded that age verification for under-13s remains a challenge across the industry.
The research has prompted concern among parents and safety advocates, with reports of children being groomed or left distressed after encountering inappropriate content on the platform. Roblox's chief safety officer said the company is committed to continually evolving its safety measures, pointing to more than 40 new safety enhancements introduced in 2024 and to its ongoing work with experts to improve protections for all users.