Meta Accused of Burying Internal Research Showing Facebook and Instagram Harm Mental Health

New court filings allege that Meta Platforms knowingly suppressed internal research showing that its social media products—especially Facebook and Instagram—cause measurable harm to users’ mental health, particularly among teens. The explosive claims, made in unredacted legal documents, are part of a class-action lawsuit brought by U.S. school districts against Meta, TikTok, Google, and Snapchat.

According to the newly revealed documents, Meta conducted a 2020 internal study code-named “Project Mercury,” in collaboration with Nielsen. The research tested the effects of users deactivating their Facebook and Instagram accounts for one week. The results were damning: participants reported feeling less depressed, less anxious, less lonely, and less prone to social comparison.

But rather than publicly releasing these findings—or further pursuing the research—Meta reportedly shut the project down, claiming the results were skewed by media bias. Internally, however, employees acknowledged the study showed causal links between social media usage and mental health harm.

“The Nielsen study does show causal impact on social comparison,” one researcher wrote, followed by a sad face emoji. Another compared the company’s actions to the tobacco industry knowingly hiding the dangers of cigarettes.

Despite these internal admissions, Meta allegedly told Congress it had no way of quantifying whether its platforms were harmful to teen girls. In public, the company pushed a narrative of safety and concern, even as private documents suggested otherwise.

Internal Chaos, Public Denials

Meta spokesman Andy Stone responded by claiming the study was methodologically flawed and denied any cover-up. “The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens,” he said Saturday.

But the broader complaint from plaintiffs—represented by law firm Motley Rice—paints a much darker picture. The lawsuit accuses social media giants of knowingly designing addictive and unsafe products, failing to protect minors from sexual predators, and manipulating child-focused advocacy groups for PR cover.

Among the most disturbing Meta-specific allegations:

  • Ineffective Youth Safety Tools: Meta allegedly designed safety features that were hard to use, rarely deployed, and blocked tests on more effective tools for fear of hurting growth metrics.

  • Lax Enforcement on Predators: Internal policy reportedly allowed users to attempt sex trafficking 17 times before being removed.

  • Growth Over Safety: Executives knowingly optimized algorithms that pushed harmful content to teens to drive engagement.

  • Delayed Action on Child Predators: Internal efforts to curb predator contact with minors were delayed or shut down over concerns about user growth.

  • Zuckerberg’s Priorities: CEO Mark Zuckerberg allegedly texted in 2021 that child safety wasn’t a top priority because he was more focused on the metaverse.

Stone denied these claims, stating that the company now removes sex trafficking accounts as soon as they are flagged and calling its teen safety systems “broadly effective.” He also accused the lawsuit of cherry-picking quotes and taking statements out of context.

Wider Pattern Across Platforms

The lawsuit also includes allegations against TikTok, Google, and Snapchat, though the filings are less detailed. TikTok is accused of sponsoring organizations like the National PTA to shape public perception; the company allegedly claimed it could direct the group's public statements in exchange for financial support.

In general, the complaint accuses all the platforms of:

  • Encouraging children under 13 to use their services

  • Failing to prevent exploitation and harmful content

  • Concealing internal research on mental health impacts

  • Influencing third-party organizations to cover for safety concerns

Next Steps

Meta is now fighting to keep the internal documents sealed. A hearing is scheduled for January 26 in the U.S. District Court for the Northern District of California to determine whether the documents will be made public.

If the court rules to unseal the documents, Meta may face one of the most significant transparency crises in its history—drawing parallels to Big Tobacco’s fall from grace decades ago.
