Meta put virtual-reality profit over kids’ safety, whistleblowers tell US Congress

Meta’s virtual reality ambitions came under intense scrutiny on Capitol Hill this week, as two former company researchers alleged the tech giant prioritized growth and revenue over safeguarding young users in its immersive platforms.

Testifying before a Senate panel focused on privacy and technology, former user experience researcher Cayce Savage said internal findings that indicated children were active in Meta’s VR spaces—and were encountering sexually explicit content—were sidelined or shut down. Savage told senators the company downplayed risks to minors and, in some cases, discouraged investigations into youth harms so it could maintain plausible deniability about what children were experiencing in VR.

“The company cannot be relied upon to be transparent about how its products are used or how safe they are,” Savage said, describing case studies that included bullying, sexual harassment and assault, and requests for explicit images directed at minors inside virtual environments.

Jason Sattizahn, formerly of Meta’s Reality Labs, echoed those concerns. Pressed by Senator Marsha Blackburn about reports that the company’s AI chat systems could engage in romantic or sensual exchanges with children, Sattizahn said he was not surprised. His remarks reinforced broader worries that moderation and safeguards around AI-powered interactions still lag behind product rollouts and engagement goals.

Meta has disputed the characterization put forth by the former employees. Company spokesperson Andy Stone said the allegations rely on selectively leaked materials arranged to create a misleading picture, and he rejected the notion that there was any blanket prohibition on researching young users. The company has also said previously that illustrative examples of problematic chatbot behavior did not align with corporate policy and were removed once identified.

The testimony sharpened lawmakers’ focus on how VR and mixed-reality ecosystems—praised for their potential in gaming, education, and collaboration—also create complex safety challenges. Unlike traditional social media feeds, virtual spaces combine voice chat, embodied avatars, and proximity interactions, raising the stakes for moderating harassment, grooming, or explicit content in real time. Age verification, parental controls, and enforcement tools remain uneven across immersive platforms, and critics say Meta has been too slow to implement robust protections by default.

Blackburn used the hearing to renew calls for federal safeguards, arguing that the accounts from the former researchers highlight the urgency of passing the Kids Online Safety Act. The measure, which previously cleared the Senate but did not advance in the House, would place new responsibilities on platforms to mitigate harms to minors and provide more controls for families.

For the VR industry, the stakes are growing. Meta has invested heavily in hardware, software, and content ecosystems through Reality Labs to drive mass adoption. But as the user base diversifies—and as younger users inevitably find their way into social VR spaces—regulators are demanding stronger, measurable protections. The hearing underscored that lawmakers expect companies to demonstrate not just policies on paper but effective enforcement, transparency, and a willingness to surface inconvenient findings from internal research.

Key questions emerging from the session include whether Meta suppressed or narrowed research scopes that might have exposed systemic risks to minors; how quickly the company escalated reports of abuse in VR environments; and whether product and policy teams have the mandate and resources to address youth safety before launching new features. The whistleblowers argued that without clear accountability, safety work remains vulnerable to business pressures.

Meta’s response suggests it will continue to push back on claims of negligence while emphasizing policy updates and the removal of specific problematic examples. Still, the company faces a broader crisis of confidence as lawmakers weigh new rules for immersive technologies and AI-driven interactions. With virtual reality moving from niche to mainstream, the debate over how to police behavior in embodied digital spaces—and who bears responsibility when things go wrong—has never been more urgent.
