Facebook’s parent company Meta says its rules on what content is allowed on its platforms, such as its bans on hate speech and harassment, apply to everyone.
But a board tasked with reviewing some of Meta’s toughest content moderation decisions on Tuesday said the social media giant’s claim was “misleading.”
In 2021, Meta asked the Oversight Board to review a program called cross-check, which allows celebrities, politicians and other high-profile users on Facebook and Instagram to receive an extra layer of review when their content is flagged for breaking the platform’s rules. The Wall Street Journal revealed more details about the program last year, reporting that the system shielded millions of high-profile users from the way Facebook typically enforces its rules. Brazilian soccer star Neymar, for example, was able to share nude photos of a woman who had accused him of rape with tens of millions of his fans before Facebook removed the content.
In a 57-page policy advisory opinion on the program, the Oversight Board identified several flaws in Meta’s cross-check program, including the fact that it gives certain high-profile users extra protection. The opinion also raises questions about whether the program works as intended.
“The opinion details how Meta’s cross-check program prioritizes influential and powerful users of commercial value to Meta and, as structured, fails to meet Meta’s human rights responsibilities and company values, with profound implications for users and global civil society,” Thomas Hughes, director of the Oversight Board Administration, said in a statement.
Here’s what you need to know about Meta’s cross-check program:
Why did Meta create this program?
Meta says the cross-check program aims to prevent the company from mistakenly taking action against content that does not violate its policies, especially in cases where the risk of error is higher.
The company said it applied the program to posts from media outlets, celebrities and governments. “For example, we cross-checked the account of an American civil rights activist to avoid mistakenly deleting instances of him raising awareness of the hate speech he was encountering,” Meta said in a 2018 blog post.
The company also provides more details on how the program works in its Transparency Center.
What problems did the board find?
The board concluded that the program resulted in “unequal treatment of users,” because content flagged for additional human review stays on the platform longer. Meta told the board that it can take more than five days to reach a decision on content from users covered by cross-check.
“This means that, because of cross-check, content identified as violating Meta’s rules is left on Facebook and Instagram when it is most viral and could cause harm,” the opinion reads.
The program also appears to benefit Meta’s business interests more than its commitment to human rights, according to the opinion. The board pointed to transparency issues with the program. Meta doesn’t tell the public who’s on its cross-check list and fails to track data on whether the program actually helps the company make more accurate content moderation decisions.
The board asked Meta 74 questions about the program. Meta fully answered 58 of them and partially answered 11. The company did not answer five.
What changes did the board recommend to Meta?
The board made 32 recommendations to Meta, including that the company prioritize expression that is important for human rights and review those users in a workflow separate from the one used for its business partners. A user’s follower count or celebrity status should not be the sole criterion for receiving extra protection.
Meta should also remove or hide highly severe content that is flagged as rule-breaking on first review while moderators take a second look at the post.
“This content should not be allowed to remain on the platform accumulating views simply because the person who posted it is a business partner or celebrity,” the opinion stated.
The board also wants Meta to be more transparent about the program by publicly marking certain accounts protected by cross-check, such as state actors, political candidates and business partners, so that the public can hold them accountable for whether they follow the platform’s rules. Users should also be able to appeal cross-checked content to the board.
How did Meta respond to the board’s advice?
The company said it was reviewing the board’s advice and would respond within 90 days.
Meta said that over the past year it has worked to improve the program, including expanding cross-check to all 3 billion users. The company said it uses an algorithm to determine whether content poses a higher risk of being mistakenly deleted. Meta also noted that it has established annual reviews to determine who receives an additional level of review.