Roblox Faces Lawsuits: AI Age Verification to Shield Kids in Chats

Roblox is introducing a new age verification process in response to rising concerns about child safety on its platform. The updated policy requires users to either submit a government-issued ID or use an artificial intelligence (AI) facial recognition tool. The initiative aims to strengthen safety measures by preventing children from interacting with adult strangers during gameplay and chats.

Facilitating Safer Interactions

Roblox allows users to create and play online games and to interact socially. The platform has more than 150 million global users, roughly one-third of whom are under the age of 13. However, increasing incidents of child grooming and abuse on the platform have led to heightened scrutiny and legal action against the company.

Legal Challenges Faced by Roblox

  • Kentucky and Louisiana attorneys general have filed lawsuits claiming the platform endangers children.
  • Florida’s attorney general issued a criminal subpoena, labeling Roblox a “breeding ground for predators.”
  • Families, such as that of Becca Dallas, are suing Roblox following tragic incidents involving their children.

Roblox previously implemented various safety features, including parental controls and content moderation. Despite these efforts, the new age verification policy aims to significantly increase user accountability.

Implementation of AI Age Verification

Starting Tuesday, all users must verify their age before accessing chat features. They can either submit a government ID or let AI technology estimate their age through facial recognition. The technology, provided by the identity-verification company Persona, uses the front camera of the user’s device to perform the check.

How It Works

  • Users will be prompted to move their faces in specific directions during the verification process.
  • The AI categorizes users into age ranges: under 9, 9-12, 13-15, 16-17, 18-20, or 21+.
  • Users can interact only with individuals in or near their estimated age range.

For instance, a user identified as 12 years old may chat with those aged 15 and younger, but not with anyone 16 or older. If the age estimate is inaccurate, users over 13 can upload an ID to correct their estimated age. Parents connected via Roblox parental controls can also correct their child’s age.
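The restriction described above can be illustrated with a short sketch. Note that this is not Roblox’s actual implementation: the bucket boundaries come from the article, but the exact matching rule is an assumption, modeled here as “identical or adjacent age ranges may chat,” which reproduces the example of a 12-year-old chatting with users 15 and younger but not 16 and older.

```python
# Hypothetical sketch of the age-bucket chat rule described in the article.
# The "adjacent buckets may chat" rule is an assumption, not Roblox's
# documented algorithm.

AGE_BUCKETS = ["under 9", "9-12", "13-15", "16-17", "18-20", "21+"]

def bucket_index(age: int) -> int:
    """Map an estimated age to one of the six buckets named in the article."""
    if age < 9:
        return 0
    if age <= 12:
        return 1
    if age <= 15:
        return 2
    if age <= 17:
        return 3
    if age <= 20:
        return 4
    return 5

def can_chat(age_a: int, age_b: int) -> bool:
    """Allow chat only when the two users' buckets are identical or adjacent."""
    return abs(bucket_index(age_a) - bucket_index(age_b)) <= 1
```

Under this assumed rule, `can_chat(12, 15)` is allowed while `can_chat(12, 16)` is not, matching the article’s example.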

Plans for Global Rollout

Roblox will initially implement this age verification system on a voluntary basis. However, it will become mandatory in Australia, New Zealand, and the Netherlands starting in December. A global rollout is expected to follow early next year.

Industry-Wide Safety Efforts

This move aligns Roblox with broader trends across digital platforms, where companies like YouTube and Meta are using AI for age estimation to protect young users. Roblox’s Chief Safety Officer, Matt Kaufman, emphasized the platform’s commitment to creating a safe environment for all users.

Roblox says that facial images captured for age estimation will be used only for that purpose and deleted afterward. The company’s stated goal is to limit interactions between minors and unknown adults, thereby improving user safety.
