Meta oversight board urges changes to VIP moderation system
Facebook parent Meta’s quasi-independent oversight board said Tuesday that an internal system that exempted high-profile users, including former U.S. President Donald Trump, from some or all of its content moderation rules needs a major overhaul.
The report by the Oversight Board, which was more than a year in the making, said the system “is flawed in key areas which the company must address.”
Meta asked the board to look into the system after The Wall Street Journal reported last year that it was being abused by many of its elite users, who posted material, including harassment and incitement of violence, that would have drawn penalties for ordinary users.
According to the Journal article, Facebook’s rules seemed not to apply at all to some VIP users, while rule-breaking posts by others were queued for reviews that never took place. The system covered at least 5.8 million exempted users as of 2020, the article said.
The system, known as “XCheck” or cross-check, was exposed in Facebook documents leaked by Frances Haugen, a former product manager turned whistleblower. Her revelations, alleging that the social media company prioritized profits over online safety, captured worldwide headlines and galvanized regulators into cracking down on hate speech and misinformation.
Nick Clegg, Meta’s president for global affairs, tweeted that the company requested the review of the system “so that we can continue our work to improve the program.”
To fully address the board’s recommendations, “we’ve agreed to respond within 90 days,” he added.
The company has said cross-check, which applies to Facebook and Instagram, was designed to prevent “overpolicing,” or mistakenly removing content thought to be breaking the platform’s rules.
The Oversight Board’s report said that the cross-check system resulted in users being treated unequally and that it delayed the removal of rule-violating content because posts passed through as many as five separate checks. Decisions took more than five days on average, it found.
For content posted by American users, the average decision took 12 days; for Afghanistan and Syria, it was 17 days. In some cases it took far longer: one piece of content waited 222 days, more than seven months, for a decision, the report said, without providing further details.
Among its 32 recommendations, the board said Meta “should prioritize expression that is important for human rights, including expression which is of special public importance.”
Human rights defenders, advocates for marginalized communities, public officials and journalists should be given higher priority than users added to the cross-check list because of their commercial importance to Meta, such as big companies, political parties, musicians, celebrities and artists, the report said.
“If users included due to their commercial importance frequently post violating content, they should no longer benefit from special protection,” the board said.
Addressing other flaws, the board also urged Meta to remove or hide content while it’s being reviewed and said the company should “radically increase transparency around cross-check and how it operates,” such as outlining “clear, public criteria” on who gets to be on the list.
The board upheld Facebook’s decision to ban Trump last year out of concern he incited violence leading to the riot on the U.S. Capitol. But it said the company failed to mention the cross-check system in its request for a ruling. The company has until Jan. 7 to decide whether to let Trump back on.
Clegg said in a blog post that Meta has already been making changes to cross-check, including standardizing it so that it’s “run in a more consistent way,” opening up the system to content from all 3 billion Facebook users and holding annual reviews to verify its list of elite users and entities.
After widespread criticism that it failed to respond swiftly and effectively to misinformation, hate speech and harmful influence campaigns, Facebook set up the oversight panel as the ultimate referee of thorny content issues it faces. Members include a former Danish prime minister, the former editor-in-chief of British newspaper the Guardian, as well as legal scholars and human rights experts.
Some critics have previously questioned the board’s independence and said its narrow content decisions seemed to distract from wider problems within Facebook and concerns about government regulation.