Roblox to enact child-safety changes giving parents more control

By Cecilia D’Anastasio and Olivia Carville | Bloomberg

Roblox Corp., the video-game platform popular with preteens, is instituting a number of reforms following widespread criticism of its child-safety policies and the arrests of alleged child abusers using the service.

Users under the age of 13 will need parental permission to access certain Roblox chat features, according to a copy of an email the company sent to parents this week. Kids under 9 will also need permission to play games with moderate violence or crude humor, according to the email, which was reviewed by Bloomberg News.

The video-game company, based in San Mateo, California, is also introducing a new type of account that will let parents monitor their child’s online activity and friends.

A Roblox spokesperson declined to share more information about the changes, which the company plans to implement next month.

The moves “are part of Roblox’s commitment to making the platform one of the safest online environments for our users, particularly the youngest users,” the company said in an emailed statement.

The changes follow Roblox’s July announcement that it would label games based on the type of content they contain, rather than by age.


Roblox’s child-safety policies have come under scrutiny in recent months. Since 2018, US police have arrested at least two dozen people who were accused of abusing or abducting victims they met or groomed on Roblox, according to a July Bloomberg Businessweek investigation. Some of those predators were already on sex-offender registries or had been accused of abusing minors offline. Roughly 40% of Roblox’s 80 million daily active accounts are under age 13.

Unlike competitors, Roblox until recently let children of any age establish accounts and talk to strangers. Child-safety advocates have criticized the ease with which kids on Roblox can chat with people they don’t know — an issue the changes may address. As of September, Roblox no longer lets children under age 5 set up accounts.

Following the Businessweek investigation, short seller Hindenburg Research published a report saying Roblox doesn’t do enough to stop child predators from accessing the platform. The report also accused Roblox of inflating its user numbers and time played, although some industry observers questioned that claim. Days earlier, short seller the Bear Cave issued a similar report alleging Roblox facilitated “widespread child abuse.”

A Roblox spokesperson described the Hindenburg report as “misleading,” saying the authors “neglected to accurately report on the company’s public disclosures.”

“We firmly believe that Roblox is a safe and secure platform and in the financial metrics we report,” the spokesperson said at the time.

In an interview earlier this year, Roblox’s chief safety officer, Matt Kaufman, rejected the idea that Roblox has a systemic child safety problem and said the platform’s moderation systems scan all chat and digital content for inappropriate content.

©2024 Bloomberg L.P.
