Meta has introduced stricter measures to curb the spread of unoriginal content on Facebook, targeting accounts that repeatedly repost others' videos, photos, or text without meaningful edits, attribution, or original contribution.
In a statement released Monday, the company said it has removed around 10 million accounts in 2025 for impersonating popular creators. A further 500,000 accounts have been penalized for spam behavior or generating fake engagement.
“These accounts will see reduced reach and be ineligible for Facebook’s monetization programs,” Meta announced, warning that repeat violators could lose distribution privileges entirely.
Curbing Low-Effort and AI-Generated Content
Meta’s move follows YouTube’s recent clarification of its rules on reused and AI-generated content, as platforms face growing concern over the rise of low-effort, mass-produced media, often referred to as “AI slop.” Examples include videos stitched together from random visuals, synthetic voiceovers, and copied content with minimal editing.
Although Meta’s statement does not name AI directly, it warns against uploading stitched-together clips or content overlaid with watermarks, both common hallmarks of AI-generated or scraped material.
Instead, creators are urged to focus on authentic storytelling, original commentary, and high-quality captions, aligning with Facebook’s broader push for originality and engagement based on meaningful contribution.
Not Targeting Creative Engagement
Meta clarified that the policy does not apply to users who remix or creatively engage with content—such as through reaction videos, commentary, or trend participation. The primary focus remains on accounts that replicate existing material without adding value.
As part of its enforcement, duplicate videos will be demoted in user feeds, and Meta is testing a feature that will redirect viewers to the original source when duplicate content is detected.
New Tools and Transparency for Creators
To help users navigate the changes, Facebook will roll out post-level insights showing whether a piece of content has been deprioritized and why. These alerts will be accessible through the Professional Dashboard, enabling creators to track performance and avoid violations.
Criticism of Algorithmic Moderation
This shift comes as Meta faces criticism over content moderation—particularly on Instagram—where users have complained about algorithm-driven account removals and the absence of adequate human support. A petition demanding reforms to Meta’s enforcement system has already gathered nearly 30,000 signatures, reflecting growing frustration among creators and small businesses.
Fighting Fake Accounts at Scale
In its latest transparency update, Meta reported that 3% of Facebook’s global monthly active users are fake accounts. Between January and March 2025, the platform acted against one billion fake profiles.
In a related initiative, Meta is piloting a U.S.-based version of Community Notes, similar to the feature on X (formerly Twitter). The crowdsourced system lets users flag posts they consider misleading or presented out of context and add clarifying notes alongside them.
Meta said the enforcement of its updated content policies will be phased in gradually, giving creators time to adjust and adapt to the platform’s evolving expectations around originality and quality.