MENLO PARK, Calif. — A recent ruling by Facebook’s quasi-independent Oversight Board called on Meta, Facebook’s parent company, to reverse its decisions to remove two posts related to the Israel-Hamas war. The board found that the posts played a vital role in “informing the world about human suffering on both sides.”
Although Meta eventually reinstated the posts, it added warning screens because of their violent content. The board disagreed with Meta’s decision to exclude the posts from recommendations on Facebook and Instagram, given that they were intended to raise awareness about the conflict. The board also criticized Meta’s reliance on automated tools to remove potentially harmful content, which often swept up valuable posts documenting human-rights violations, and strongly urged Meta to preserve such content in the future.
The Oversight Board, which Meta established three years ago, issued these decisions on an expedited basis, taking only 12 days instead of the usual 90. In one case, Instagram removed a video depicting the aftermath of a strike on or near Al-Shifa Hospital in Gaza City. The post showed injured or killed Palestinians, including children. Although Meta ultimately reversed its decision, the board criticized the demotion and the warning screen placed on the post, which limited its visibility and reach.
The other case involved a video posted on Facebook showing an Israeli woman pleading with her kidnappers not to harm her during the Hamas raids in Israel. The board highlighted both instances to underscore the importance of preserving valuable content related to the conflict.
The ruling has prompted broader discussion of how social media platforms balance removing harmful content against protecting freedom of speech and raising awareness of global events. Meta will need to weigh the implications of the decision and adjust its policies accordingly.
Users Appeal Meta’s Decision on Content Removal
The cases reached the Oversight Board after users appealed Meta’s decisions to remove the posts. The board disclosed a significant surge in appeals concerning the Middle East and North Africa region following October 7.
Meta said it accepted the Oversight Board’s decision.
“We highly value both expression and safety for our users. While Meta initially removed the content, the board overturned this decision. However, they approved our subsequent action of restoring the content with a warning screen. Therefore, no further measures will be taken,” stated the company. “As the board did not provide any recommendations, this case will not be subject to any future updates.”
In a briefing on the cases, the Oversight Board said Meta had confirmed temporarily lowering the thresholds for its automated content detection and removal tools. The adjustment was intended to curb harmful content but also led to the inadvertent removal of non-violating and valuable posts. As of December 11, Meta had yet to restore the thresholds to their pre-October 7 levels.