Wednesday, September 10, 2025

Meta in hot seat again over whistleblower safety allegations

Meta is facing Congress’s ire once again over its approach to online safety, after several current and former employees came forward with allegations that the tech giant attempted to “bury” findings about safety concerns across its platforms — particularly newer virtual and augmented reality products. 

Six current and former Meta employees detailed concerns about the company’s handling of user data and its approach toward safety research following previous whistleblower complaints in documents shared with Congress. 

“Meta has knowingly, willfully, intentionally swung the door wide open on exposing these children to social media harms when they are on their platform,” Sen. Marsha Blackburn (R-Tenn.) said Tuesday ahead of a hearing with two of the whistleblowers, former Meta researchers Jason Sattizahn and Cayce Savage. 

They have accused Meta of doctoring and restricting research into safety concerns in an effort to avoid legal liability, noting a “vast and negative change” following the revelations by Facebook whistleblower Frances Haugen in 2021. 

Haugen appeared before Congress nearly four years ago, alleging the company was aware of the negative impacts of its platforms on young users but chose to prioritize profits over people.  

After these revelations, the whistleblowers said several research areas, including youth and product harm issues, were deemed “sensitive” and came under the scrutiny of Meta’s legal team. 

“Put differently: after Ms. Haugen exposed Meta’s internal research which established leadership’s explicit knowledge about the platform’s harms toward children, Meta redefined the scope of the research in order to establish plausible deniability while simultaneously publicly stating that it has increased tools and systems to mitigate those harms,” a disclosure to Congress reads. 

One whistleblower, referred to as Alpha to protect their anonymity, sought to study virtual reality (VR) users’ understanding of safety tools but was allegedly told by Meta’s legal team not to record data from participants who discussed harms. If such data was captured, researchers were directed to delete it or eject the participants from the study. 

Shortly after, Alpha’s manager reportedly insisted on running a VR study through a third-party vendor, which they said was “necessary to erase risky data collected in the study.” 

Amid a push to lower the minimum age for its VR products to children as young as 10 years old, another whistleblower, referred to as Beta, alleged that Meta’s legal team slowed down research into age verification and eventually canceled the project with no explanation. 

Another survey about Meta VR harms conducted by Alpha allegedly faced “heavy restrictions,” including requirements that it be run through a third-party vendor, eliminate questions and responses deemed risky and avoid data collection on harms to young users. 

They were reportedly later directed to remove survey questions about emotion, well-being and psychological harm, in addition to removing or editing responses about sexual harm. 

A Meta legal team member, Kristen Zobel, justified the restrictions “by stating that the company did not want to have data showing the psychological and emotional harm its products generate if Meta was audited and in light of public opinion and previous ‘leaks,’” according to the disclosure. 

A separate whistleblower, referred to as Charlie, raised concerns to Meta’s director of VR research, Tim Loving, about a directive from the legal team not to collect data on mentions of VR users younger than 13.  

After Charlie said this made them feel “icky,” Loving reportedly responded that they were “going to have to swallow that ick.” 

Sen. Richard Blumenthal (D-Conn.) on Tuesday took aim at the Metaverse, the company’s virtual world accessible through its virtual and augmented reality products. 

“Metaverse is a cesspool, filled with pedophiles, exploiters, groomers, traffickers. And Meta knows it,” he said. “They know it, and they have stifled and suppressed the research and truth telling that would provide Congress with all of the facts that are needed to support the Kids Online Safety Act and other measures that will protect children.” 

Blumenthal and Blackburn have led the push to pass the Kids Online Safety Act, legislation that aims to regulate the features offered to children online and reduce the addictive nature and mental health impacts of platforms. 

“Meta took a lesson from Frances Haugen,” Blumenthal added. “It was the wrong lesson. Their lesson was no more documents, no more research, no more truth telling. We don’t want to see it, hear it.” 

Meta spokesperson Andy Stone pushed back on the allegations, arguing the claims are “nonsense” and based on “selectively leaked internal documents” used to “craft a false narrative.” 

“The truth is there was never any blanket prohibition on conducting research with young people and, since the start of 2022, Meta approved nearly 180 Reality Labs-related studies on issues including youth safety and well-being,” he said in a statement. 

The recent revelations come as the company, which has long drawn scrutiny over its approach to kids’ safety, separately faces backlash over how its AI chatbots interact with children.  

Reuters reported last month that a Meta policy document featured examples suggesting its AI chatbots could engage in “conversations that are romantic or sensual” with children. 

Meta said this was an error and that it removed the language. It also told TechCrunch it was adjusting its approach to teen safety, training chatbots not to engage with young users on self-harm, suicide, disordered eating or inappropriate romantic conversations. 

Just last year, Meta CEO Mark Zuckerberg was hauled before Congress, alongside several other tech leaders, to discuss kids’ safety concerns. Following a heated exchange, Zuckerberg turned around to face parents and activists and apologized. 

“It was in this very room with this very committee, maybe it was in a different room but the same committee, where Mark Zuckerberg actually turned to some families who had lost children to drugs and said, ‘I’m sorry, I’m sorry this happened,’” Sen. Amy Klobuchar (D-Minn.) said at Tuesday’s hearing. “Well, sorry is not enough anymore.” 
