Oversight board says Facebook’s automated image deletions are broken

Meta’s Oversight Board said the company should be more careful with automated moderation tools, criticizing it for removing a cartoon depicting police violence in Colombia. The decision came as the board took up a series of new cases, including a question about a sexual assault video in India.

The Oversight Board, a semi-independent body funded by Meta, considered a political cartoon depicting Colombian police beating a man with batons. The cartoon was at some point added to Meta’s Media Matching Service database, meaning Meta’s systems automatically flagged it for removal whenever users posted it. But as users had their posts taken down, they began appealing the decisions, and winning. The Oversight Board says 215 people appealed the removal and 98 percent of the appeals were successful. Meta, however, did not remove the cartoon from the database until the Oversight Board took up the case.

Automation can amplify the effects of a bad moderation call

This fact troubled the Oversight Board. “By using automated systems to remove content, Media Matching Service banks may amplify the impact of incorrect decisions made by individual human reviewers,” the decision reads. A more responsive system could have addressed the problem by triggering a review of the bank when individual posts containing the image were successfully appealed. Otherwise, images banked on the basis of a single bad call could remain quietly banned indefinitely, even as individual reviewers keep reaching a different conclusion.
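To make the board’s suggestion concrete, here is a minimal sketch of a matching bank that pauses automated removals and escalates an entry for human re-review once enough of its removals are overturned on appeal. The class names, thresholds, and methods are hypothetical illustrations, not Meta’s actual system.

```python
from dataclasses import dataclass


@dataclass
class BankEntry:
    image_hash: str
    policy: str                  # the community standard the entry enforces
    removals: int = 0
    successful_appeals: int = 0


class MatchingBank:
    """Toy matching bank that stops auto-removal after repeated overturned appeals."""

    APPEAL_THRESHOLD = 10           # minimum number of overturned removals
    OVERTURN_RATE_THRESHOLD = 0.5   # share of removals that were overturned

    def __init__(self):
        self.entries = {}           # image_hash -> BankEntry
        self.needs_review = set()   # hashes escalated back to human reviewers

    def add(self, image_hash, policy):
        self.entries[image_hash] = BankEntry(image_hash, policy)

    def should_remove(self, image_hash):
        """Called when a newly posted image matches a banked hash."""
        entry = self.entries.get(image_hash)
        if entry is None or image_hash in self.needs_review:
            return False
        entry.removals += 1
        return True

    def record_successful_appeal(self, image_hash):
        """Called when a removal driven by this bank entry is overturned on appeal."""
        entry = self.entries.get(image_hash)
        if entry is None or entry.removals == 0:
            return
        entry.successful_appeals += 1
        overturn_rate = entry.successful_appeals / entry.removals
        if (entry.successful_appeals >= self.APPEAL_THRESHOLD
                and overturn_rate >= self.OVERTURN_RATE_THRESHOLD):
            # Pause automated removals and flag the entry itself for re-review.
            self.needs_review.add(image_hash)
```

In the Colombian cartoon case, 215 appeals with a 98 percent success rate would trip a check like this almost immediately; instead, the entry stayed in the bank until the board intervened.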

This is one of many Oversight Board cases asking whether Facebook and Instagram’s automated moderation is calibrated to avoid overly aggressive takedowns, and as in previous cases, the board wants more methods of, well, oversight. “The board is particularly concerned that Meta does not measure the accuracy of Media Matching Service banks for specific content policies,” it notes. “Without this data, which is crucial to improving the operation of these banks, the company cannot say whether this technology works more effectively for certain community standards than for others.”

It asks Meta to publish the error rates for content mistakenly included in the matching bank. As usual with the board’s policy recommendations, Meta must respond to the suggestion but isn’t required to implement it.
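For a sense of what that per-policy measurement might look like, the sketch below groups bank-driven removals by the policy they enforce and reports how often each was overturned on appeal. The log data, field names, and figures are made up for illustration and do not reflect Meta’s actual tooling or numbers.

```python
from collections import defaultdict

# Hypothetical log of bank-driven removals: (policy, overturned_on_appeal)
removal_log = [
    ("violent_and_graphic_content", False),
    ("violent_and_graphic_content", True),
    ("dangerous_organizations", True),
    ("dangerous_organizations", True),
    ("dangerous_organizations", False),
]


def error_rates_by_policy(log):
    """Share of Media Matching Service removals overturned on appeal, per policy."""
    totals = defaultdict(int)
    overturned = defaultdict(int)
    for policy, was_overturned in log:
        totals[policy] += 1
        if was_overturned:
            overturned[policy] += 1
    return {policy: overturned[policy] / totals[policy] for policy in totals}


print(error_rates_by_policy(removal_log))
# {'violent_and_graphic_content': 0.5, 'dangerous_organizations': 0.6666666666666666}
```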

Meta’s penalties for praising extremist groups are ‘unclear and severe’

The Oversight Board also addressed one of several cases testing Facebook’s line between supporting extremist groups and reporting on them. It determined that Meta erred in removing an Urdu-language Facebook post reporting on the Taliban reopening schools and colleges for women and girls. The post was taken down under a rule that prohibits “praise” of groups like the Taliban. It was referred, after an appeal, to a special moderation queue but was never reviewed; the Oversight Board notes that at the time Facebook had fewer than 50 Urdu-speaking reviewers assigned to that queue.

The case, according to the board, “may indicate a wider problem” with the rules on dangerous organizations. After several such incidents, it says, the policy seems unclear to both users and moderators, and the penalties for violating the rule are “unclear and severe.” It asks for a clearer, narrower definition of “praise” of dangerous individuals and for more moderators to be devoted to the review queue.

Meanwhile, the Oversight Board is seeking public comment on two cases. The first concerns a video of a mass shooting at a Nigerian church, which was removed for violating Meta’s “violent and graphic content” policy but which may have had news value that justified keeping it up. Similarly, the board is interested in whether a video depicting sexual assault in India should be allowed in order to raise awareness of caste- and gender-based violence, or whether its graphic depiction of non-consensual touching is too harmful on its own. The comment window for both cases closes on September 29.