Children can easily come across harmful content and engage in unsupervised interactions with adults on the Roblox gaming platform, according to “deeply disturbing” research.
The findings come after parents raised grave concerns about their children becoming addicted, viewing traumatic material, and being solicited by strangers on the wildly popular website and app.
Roblox admits that its platform exposes children to potentially harmful content and “bad actors.” It says that while it is making every effort to address this, government action and industry-wide cooperation are required.
Roblox, which bills itself as “the ultimate virtual universe,” offers millions of games and interactive settings that are referred to as “experiences.” Roblox creates some of the material, but most of it is created by users. Over 85 million people used the platform every day in 2024, with an estimated 40% of them being under the age of 13.

The firm said it “deeply sympathized” with parents whose children had been harmed on the site, but maintained that “tens of millions of people have a positive, enriching, and safe experience on Roblox every day.”
The digital behavior specialists Revealing Reality, however, found “something deeply disturbing … a troubling disconnect between Roblox’s child-friendly appearance and the reality of what children experience on the platform” in an analysis that was shared with the Guardian.
Revealing Reality created several Roblox accounts registered to fictitious users aged five, nine, ten, thirteen, and over forty. To ensure their avatars’ behavior was unaffected, the accounts communicated only with each other and with no one outside the experiment.

The researchers concluded that “there are still significant risks for children on the platform and the safety controls that exist are limited in their effectiveness,” even with the introduction of new tools last week intended to give parents more control over their children’s accounts.
The research found instances of adults and children interacting without proper age verification, with children as young as five able to converse with adults while playing games on the platform. This remained the case even after Roblox changed its settings in November of last year to prevent accounts associated with under-13s from sending direct messages to other users outside of games or experiences; such accounts can now only send public broadcast messages.
The study also found that the avatar on the 10-year-old’s account had access to “highly suggestive environments.” These included a public restroom where characters were urinating and avatars could dress up in fetish accessories, as well as a hotel room where they could watch a female avatar in fishnet stockings gyrating on a bed and other avatars lying on top of one another in sexually suggestive poses.
Researchers found that when their test avatars used the voice chat feature, they could hear repeated sucking, kissing, and grunting sounds, as well as conversations with other users describing sexual activity. Roblox says all voice chat, which is available to phone-verified accounts registered as belonging to users aged 13 or older, is subject to real-time AI moderation.

They also found that a test avatar belonging to an adult could use barely coded language to request the five-year-old test avatar’s Snapchat details. Although Roblox says in-game text chat is governed by built-in filters and moderation, the research describes this as an illustration of how readily these safeguards can be circumvented, opening the door to predatory behavior.
Although Roblox acknowledged that “there are bad actors on the internet,” it also stated that this was “an issue that goes beyond Roblox and needs to be addressed through collaboration with governments and an industry-wide commitment to strong safety measures across all platforms.”
It also acknowledged that age verification for children under 13 “remains an industry challenge.”
Parents responding to a Guardian Community callout shared stories including that of a 10-year-old boy who was groomed by an adult he met on the platform and a 9-year-old girl who began experiencing panic attacks after viewing sexual content while gaming.

Revealing Reality’s research director, Damon De Ionno, said Roblox’s new safety features, unveiled last week, fall short: “How are parents supposed to moderate when there are 6 million experiences [on the platform], many of which have erroneous ratings and descriptions, and children can still communicate with strangers who are not on their friend list?”
The study revealed the platform’s “systematic failure to keep children safe,” according to cross-bench peer and internet safety advocate Beeban Kidron, who added: “This kind of user research should be routine for a product like Roblox.”
“Trust and safety are at the core of everything we do,” said Roblox chief safety officer Matt Kaufman. “To safeguard our community, particularly young people, we are always improving our policies, technology, and moderation initiatives. This entails making investments in cutting-edge safety tools, collaborating closely with experts, and providing parents and caregivers with strong controls and tools.
“We have implemented over 40 new safety features in 2024 alone, and we are fully committed to continuing to make Roblox a secure and respectful environment for all users.”