Roblox is rolling out new safety features that use AI to scan teens’ video selfies and verify their ages, unlocking unfiltered chat through a new “Trusted Connections” system. The platform aims to keep users engaged while addressing mounting child-safety concerns, following reports of at least 24 arrests tied to grooming and abuse on the platform.
What you should know: The new age verification system requires users to submit a video selfie, which AI analyzes against a “diverse dataset” to estimate their age (the resulting decision flow is sketched in code after this list).
- Users who appear under 13 automatically lose access to age-inappropriate features, while those whose ages cannot be determined with “high confidence” must use ID verification to proceed.
- Teen users (13-17) who pass verification can add other teens as Trusted Connections, but adding adults requires in-person QR code scanning or phone number verification.
- The company deletes the biometric data after 30 days, except when a warrant or subpoena requires longer retention.
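Stripped to its logic, that flow is a branch on the model’s age estimate and its confidence. Here is a minimal sketch in Python, assuming a single confidence score; the names, types, and the 0.9 cutoff are all hypothetical, since Roblox hasn’t published what “high confidence” means:

```python
from dataclasses import dataclass
from enum import Enum, auto


class Outcome(Enum):
    RESTRICTED = auto()      # treated as under 13: age-gated features disabled
    NEEDS_ID_CHECK = auto()  # estimate not confident enough: fall back to ID verification
    TEEN_VERIFIED = auto()   # 13-17 with high confidence: eligible for Trusted Connections
    ADULT_VERIFIED = auto()  # 18+ with high confidence


@dataclass
class AgeEstimate:
    age: float         # age estimated from the video selfie
    confidence: float  # model confidence in the estimate, 0.0-1.0


# Hypothetical cutoff: Roblox has not said what counts as "high confidence."
HIGH_CONFIDENCE = 0.9


def classify(estimate: AgeEstimate) -> Outcome:
    """Map an AI age estimate to an access decision, per the flow above."""
    if estimate.confidence < HIGH_CONFIDENCE:
        return Outcome.NEEDS_ID_CHECK  # user must verify with ID to proceed
    if estimate.age < 13:
        return Outcome.RESTRICTED
    if estimate.age < 18:
        return Outcome.TEEN_VERIFIED
    return Outcome.ADULT_VERIFIED
```

The ID-verification fallback is the backstop for exactly the miscategorization risk critics raise below.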
How Trusted Connections works: Verified users gain access to unfiltered voice and text chat with people they know, lifting the usual restrictions on inappropriate language and personally identifiable information (the rules are sketched in code after this list).
- Communications remain subject to Roblox’s community standards and moderation for predatory behavior, sexualization of minors, and inappropriate sexual conversations.
- The feature only applies to party voice and text chats, not other platform interactions.
- Parents can access dashboards to see their child’s trusted connections, though they won’t receive proactive notifications.
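Combined with the add rules above, the policy boils down to two checks: who can become a trusted connection, and which chat surfaces the unfiltered mode actually covers. A sketch under those assumptions, with all names illustrative and moderation noted in comments rather than modeled:

```python
from enum import Enum, auto


class AddMethod(Enum):
    IN_APP = auto()        # ordinary in-app request
    QR_SCAN = auto()       # in-person QR code scan
    PHONE_VERIFY = auto()  # phone number verification


def can_add_trusted_connection(requester_age: int, target_age: int,
                               method: AddMethod) -> bool:
    """Verified teens (13-17) may add other teens directly; adding an
    adult (18+) requires an in-person QR scan or phone verification."""
    if not 13 <= requester_age <= 17:
        return False  # only verified teens are covered by this flow
    if 13 <= target_age <= 17:
        return True   # teen-to-teen: no extra step required
    if target_age >= 18:
        return method in (AddMethod.QR_SCAN, AddMethod.PHONE_VERIFY)
    return False      # under-13 users cannot be added


def chat_is_unfiltered(context: str, all_participants_trusted: bool) -> bool:
    """Unfiltered chat applies only to party voice/text among Trusted
    Connections; all chat still passes through moderation for predatory
    behavior and sexual content."""
    return context in ("party_voice", "party_text") and all_participants_trusted
```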
The safety concerns: Online safety experts argue the new system places too much responsibility on minors to identify and manage risks.
- “Predators frequently use real-world grooming tactics,” says Kirra Pendergast, CEO of Safe on Social, an online safety organization. “A QR scan doesn’t verify a trusted relationship. A predator could build trust online, then manipulate the child into scanning a QR code offline.”
- Machine-learning age estimation tools can miscategorize users, and the feature covers only chat, “leaving large surface areas exposed making it a brittle barrier at best.”
By the numbers: Roblox serves approximately 98 million users across 180 countries, more than 60% of whom are over age 13, according to chief safety officer Matt Kaufman.
What they’re saying: Roblox executives frame the changes as proactive safety measures rather than reactive responses to criticism.
- “What we’re doing with this announcement is also trying to set the bar for what we think is appropriate for kids,” Kaufman told WIRED.
- “We feel that we’re really setting the standard for the world in what it means to have safe, open communication for a teen audience.”
- Pendergast counters that effective safety requires systemic changes: “Age estimation, parental dashboards, and AI behavioral monitoring must be default, not optional, creating a baseline of systemic defense.”
Why this matters: The updates come as Roblox faces intensifying scrutiny over child safety, including multiple lawsuits and a 2024 Bloomberg report documenting arrests of at least 24 people who used the platform for grooming, abuse, or abduction. By keeping communications within Roblox rather than losing users to platforms like Discord, the company maintains oversight while potentially exposing minors to new risks through reduced content filtering.