NEW YORK: Meta’s (META.O) Oversight Board on Tuesday called on the company to end a blanket ban on the Arabic word “shaheed,” or “martyr” in English, after a year-long review found the Facebook owner’s approach was “overbroad” and needlessly suppressed the speech of millions of users.
The board, which is funded by Meta but operates independently, said the social media giant should remove posts containing the word “shaheed” only when they are linked to clear signs of violence or when they separately violate other Meta rules.
The decision comes after years of criticism of the company’s handling of content involving the Middle East, including a 2021 study commissioned by Meta that found its approach had an “adverse human rights impact” on Palestinians and other Arabic-speaking users of its services.
This criticism has escalated since hostilities between Israel and Hamas began in October. Human rights groups have accused Meta of suppressing pro-Palestinian content on Facebook and Instagram against the backdrop of a war that has killed tens of thousands of people in Gaza after Hamas’s deadly raids on Israel on October 7.
The Meta Oversight Board reached similar conclusions in its report on Tuesday, finding that Meta’s rules for “shaheed” failed to take into account the word’s diversity of meanings and led to the removal of content that was not aimed at glorifying acts of violence.
“Meta operates on the premise that censorship can and will improve security, but evidence suggests that censorship can marginalize entire populations without improving security at all,” Oversight Board Co-Chair Helle Thorning-Schmidt said in a statement.
Meta currently removes all posts that use “shaheed” in reference to individuals on its list of “dangerous organizations and individuals,” which includes members of Islamist militant groups, drug cartels and white supremacist organizations.
The company treats the word as praise for those entities, which it prohibits, according to the board’s report.
Hamas is among the groups the company labels a “dangerous organization.”
Meta sought the board’s input on the issue last year after a policy review it launched in 2020 failed to reach internal consensus, the board said. In its request, the company disclosed that the term “shaheed” accounted for more content removals on its platforms than any other single word or phrase.
A Meta spokesperson said in a statement that the company will review the board’s feedback and respond within 60 days.