A court filing in California has revealed that nearly one in five Instagram users aged 13 to 15 reported seeing nudity or sexual images they did not want to view, according to a March 2025 deposition from Instagram chief Adam Mosseri.
The document, made public Friday as part of a federal lawsuit and reviewed by Reuters, cited a 2021 internal survey of young users. Andy Stone, a spokesperson for Meta, said the findings came from user-reported experiences rather than a direct review of platform content.
Mosseri said during his deposition that Meta does not typically release survey data and cautioned that self-reported surveys are "notoriously problematic." Nevertheless, the disclosure adds to mounting scrutiny of the company's impact on minors.
Legal Pressure and Policy Changes
Meta, which owns Facebook and Instagram, faces thousands of lawsuits in U.S. federal and state courts. Plaintiffs accuse the company of designing addictive products and contributing to a mental health crisis among young users. Global leaders have also raised concerns about the safety of minors on social media platforms.
In response, Meta announced in late 2025 that it would remove images and videos containing nudity or explicit sexual activity for teen users, including AI-generated material. The company said it would allow limited exceptions for medical or educational purposes.
Additional Safety Concerns
The deposition also revealed that about 8% of users aged 13 to 15 reported seeing someone harm themselves or threaten self-harm on Instagram.
Mosseri noted that most sexually explicit content was shared through private messages, complicating moderation efforts. He emphasized that Meta must balance safety enforcement with user privacy, stating that many users do not want their messages reviewed.