Newly revealed court filings have raised serious concerns about Meta’s internal handling of research into the mental health effects of its platforms. The documents suggest that Meta uncovered causal links between Facebook and Instagram use and negative emotional outcomes but did not disclose the findings.
Internal Study Found Clear Signs of Harm
In 2020, Meta launched an internal research effort known as Project Mercury. The project aimed to understand the psychological impact of reducing time spent on Facebook and Instagram. According to the filings, Meta worked with a major survey company to measure how users felt after temporarily deactivating their accounts.
The results disappointed company leaders. Users who stayed off the platforms for a week reported lower levels of depression, anxiety and loneliness. They also experienced a decline in social comparison.
Although the study revealed meaningful patterns, the filings state that Meta stopped the project instead of expanding it. Internally, staff assured senior leadership that the conclusions were sound. One researcher even noted that the study showed a causal impact on social comparison. Another staff member warned that ignoring harmful findings would mirror the behaviour of industries that hid damaging evidence from the public.
Public Messaging Contradicted Internal Knowledge
Despite the internal research, the filings claim Meta later told lawmakers that it lacked the ability to measure potential harm to teenage girls. This apparent contradiction is now central to the lawsuit.
In response, a company spokesperson said the research had flawed methodology and that the organisation has consistently worked to protect young users. He maintained that the company has introduced meaningful safety improvements over the past decade.
Broader Allegations of Concealed Risks
The lawsuit includes several other accusations against major social media platforms. Plaintiffs argue that key companies hid known risks from users, parents and educational institutions.
The filings outline allegations that Meta:
- Designed youth safety tools that were rarely used or ineffective
- Set unusually high thresholds before removing accounts engaged in harmful activity
- Recognised that optimised engagement among teens increased exposure to harmful content
- Delayed protective measures for minors due to growth concerns
The documents also cite internal discussions where senior leaders appeared hesitant to prioritise child safety over other projects.
Legal Process Moves Forward
The underlying internal papers referenced in the complaint remain sealed. Meta has requested limits on what the plaintiffs can unseal, citing overly broad demands. A court hearing on these issues is scheduled for January 26 in the Northern District of California.
A Debate With Far-Reaching Implications
These allegations have amplified ongoing debates about the responsibilities of social media companies. The case raises questions about transparency, user protection and the potential psychological impact of widely used digital platforms.
As the legal process unfolds, it is expected to shape future discussions on social media regulation and youth mental health safeguards.