Meta, the parent company of Facebook, Instagram, and WhatsApp, has been urged by its Oversight Board to revoke its ban on the Arabic word “shaheed” (meaning “martyr”) on its platforms, while maintaining restrictions on describing individuals as “terrorists.”
The Oversight Board, which operates independently although it is funded by Meta, issued an advisory opinion stating that treating “shaheed” as a violation of the company’s policy on dangerous organizations and individuals unduly limits freedom of expression.
The board suggested that Meta’s current approach is unwarranted and recommended ending the blanket ban. This decision follows years of criticism regarding Meta’s management of Middle Eastern content, including a 2021 study commissioned by Meta itself, which found that its policies had a negative impact on Palestinians and other Arabic-speaking users.
Criticism intensified during the Israel-Hamas war that began in October 2023, with rights groups accusing Meta of suppressing pro-Palestinian content on Facebook and Instagram. The Oversight Board emphasized that Meta’s handling of recent events, such as the Gaza conflict, demonstrated the need to reevaluate its policies to uphold human rights during crises.
In its report, the Oversight Board concluded that Meta’s rules regarding “shaheed” failed to account for the word’s diverse meanings, resulting in the removal of content that did not glorify violence. Board co-chair Helle Thorning-Schmidt stated that Meta’s reliance on censorship as a means to improve safety has proven ineffective and can marginalize entire populations.
While acknowledging that “shaheed” can sometimes be used to glorify violent acts, the board emphasized its widespread use in reporting, academic discussion, and human rights debates to refer to individuals who die while serving their country or who are victims of sociopolitical violence or natural disasters. The board cautioned against indiscriminate content removal, highlighting the potentially counterproductive effects of censorship.