Meta’s Oversight Board to weigh in on company’s handling of Israel-Hamas war content

By News Room

Meta’s Oversight Board on Thursday initiated a review of two content moderation decisions by the social media giant related to Israel-Hamas war content.

The move marks the first time the board has conducted an “expedited” review process, a nod to the intense scrutiny Facebook-parent Meta and other social media companies have faced over content related to the conflict. The board decided to take up a faster review in this case because content decisions related to the war could have “urgent real-world consequences,” it said in a statement.

In the weeks following Hamas’ attack on Israel, major social media platforms faced questions about whether they were hosting misleading and hateful content related to the conflict. European Union officials sent warnings to TikTok, Meta, YouTube and X (formerly Twitter), raising concerns about war-related content on their platforms and reminding the social media companies they could face billions of dollars in fines if an investigation later determines they violated EU content moderation laws. US and UK lawmakers also called on the platforms to ensure they were enforcing their rules against hateful and illegal content.

Meta told CNN in October that it had established “a special operations center staffed with experts, including fluent Hebrew and Arabic speakers, to closely monitor and respond to this rapidly evolving situation,” and that it was coordinating with third-party fact checkers in the region.

In the weeks after the Israel-Hamas conflict broke out, Meta’s Oversight Board said it saw a nearly three-fold increase in daily average user appeals of decisions on content “related to the Middle East and North Africa region.”

Meta’s Oversight Board is an entity made up of experts in areas such as freedom of expression and human rights. It is often described as a kind of Supreme Court for Meta, as it allows users to appeal content decisions on the company’s platforms. The board makes recommendations to the company about how to handle certain content moderation decisions, as well as broader policy suggestions.

The Oversight Board plans to review a video posted to Instagram in early November that appears to show the aftermath of a strike outside the Al-Shifa Hospital in Gaza City, which “shows people, including children, injured or dead, lying on the ground and/or crying.” A caption under the video in Arabic and English referenced the Israeli army, stating that the hospital had been “targeted by the ‘usurping occupation.’”

Meta initially removed the post for violating its rules on graphic and violent content. A user appealed the decision, asking for the video to be restored. After the board decided to take up the case, Meta made the video viewable with a warning that the content is disturbing.

The board will also review a video that shows two hostages being kidnapped by Hamas militants — a woman on a motorbike and a man being marched away — in which the caption urges people to watch to gain a “deeper understanding” of the October 7 attack on Israel.

Meta initially removed the post for violating two policies: its rules against violence and incitement, which were temporarily revised to include content that clearly identified hostages, and its dangerous organizations and individuals policy that prohibits imagery of terror attacks on visible victims. (Meta designates Hamas as a dangerous organization under its policy and labeled the October 7 attack as a terrorist attack.) The company later reinstated the video with a warning screen “in response to trends in how hostage kidnapping videos were being shared and reported on.”

The board expects to render a decision on the cases within the next 30 days.

Meta said in a blog post Thursday that the board’s “guidance in these cases, along with feedback from other experts, will help us to continue to evolve our policies and response to the ongoing Israel-Hamas War.” It added that it plans to implement the board’s decision in each case.
