Roblox launches age checks amid safety concerns
FOX 13's Kylie Jones shares how a popular gaming app geared towards kids is stepping up its efforts to protect them from sexual predators.
TAMPA, Fla. - The popular gaming platform Roblox is rolling out a new feature requiring age checks of its users. The app continues to face scrutiny over its safety features and its protection of the children who use it.
Roblox is facing dozens of lawsuits around the country related to the alleged abuse and sexual exploitation of children.
Timeline:
Last month, Florida Attorney General James Uthmeier issued criminal subpoenas to investigate whether the app is potentially helping predators gain access to children.
In the next few weeks, Roblox is launching a new safety feature aimed at protecting children.
What they're saying:
"We're going to have what we call age-based chat, which means that we require all users who want to be able to chat on the platform to go through a quick and easy age check," said Dina Lamdany, a senior product manager with Roblox.
Product leaders with the app say this feature will help make sure its users are only able to chat with users of a similar age group.
"We use a third-party vendor that's safe and secure, and you do a quick selfie of your face to estimate your age," Lamdany said. "And based on that, we put you into an age group."
Lamdany says they're also adding new guides for parental controls within the app.
Product leaders with Roblox say they're continuously adjusting and developing safety features.
The app continues to face criticism over the effectiveness of these safety measures.
"Roblox is more or less a fertile hunting ground for child sexual predators," said Matthew Dolman of the Dolman Law Group.
Dolman has filed 30 lawsuits against the company, including suits out of Walton County and Brevard County. A third lawsuit, filed in California, involves a victim from Miami-Dade.
"All the cases, though, have the same theme," he said. "An adult masqueraded as a child, met the child, you know, consistent grooming behavior. In one case, the platform itself facilitated the actual physical abuse of a child."
Dolman questions why these safety features are being added now, as the technology has been available for years.
"There's still ways around it," he said. "And they'll, you know, children will use fake images to appear to be something different than they are to get around these safety measures where they're using artificial intelligence."
He says there needs to be a systemic change to these types of apps to better protect the safety of children.
The Source: Information for this story was provided by Roblox and Dolman Law Group.