On Facebook and Instagram, there are rules about what can and cannot be posted.
They may change from time to time, as does the way human and automated moderators enforce them. But in theory, the rules are the same for each of these sites’ nearly 5 billion users.
Unless, that is, you happen to be a politician, celebrity or business associate of Facebook and Instagram’s parent company, Meta.
Their posts, and those of some 5.8 million other influential users, pass through a special VIP channel called cross-check, which gives them extra leeway to break Meta’s rules.
That leeway can matter. If a regular user’s post is flagged by an automated moderation system, it is deleted immediately. If a VIP’s post is flagged, it stays up while a human moderator reviews it again, sometimes three, four or five times.
In September 2019, for example, Brazilian footballer Neymar posted intimate images of a woman to his Facebook and Instagram accounts without her consent.
The post clearly violated Meta’s content policies, which prohibit far more benign forms of nudity. Yet it remained online for more than a day, racking up 56 million views before being removed, according to The Guardian.
The reason for the delay? Neymar, who later announced a business deal with Meta to promote Facebook Gaming, was on the cross-check list, and the system was grappling with a backlog.
Such delays, which averaged five days, stretched to 12 days in the US and 17 days in Syria, and were among several aspects of cross-check sharply criticized by Meta’s Oversight Board, a semi-independent internal “court” set up by Mark Zuckerberg to advise on difficult moderation issues.
The board has been scrutinizing the program since last year, when whistleblower Frances Haugen revealed its scale by leaking internal company documents to The Wall Street Journal.
In a report published on Tuesday, the board called on Meta to overhaul the program, saying it “prioritizes users with commercial value” over its “human rights responsibilities and corporate values.”
Oversight board director Thomas Hughes told Sky News the system had caused “real harm”. Instead of calling for the system to be dismantled, however, he said “you do need some sort of secondary review process”.
The board called on Meta to overhaul cross-check to make the process faster and more transparent, and to refocus it on human rights-related concerns, such as the accidental deletion of news material.
It said Meta should develop clear criteria for inclusion in cross-check and publicly flag accounts, particularly those of state actors and business partners, that are covered by the system. Currently, even cross-checked users are not told they are on the list.
The report said Meta tends to be lax in enforcing its rules against business partners to avoid creating an “impression of censorship” or provoking “public controversy”, especially with partners who might cause trouble for Meta’s senior executives.
To avoid damaging delays, however, the board recommends that content flagged as “high severity” on first review should be hidden or removed while it is reassessed.
Meta does not have to follow the board’s recommendations and has declined to do so on several key occasions, although Mr Hughes said the company was inclined to implement most of them. In this case, there are 32 recommendations.
“They won’t implement all of them, but given the rate of implementation so far, I think they will implement most of them,” Mr Hughes said. “The board believes the recommendations are achievable.”
However, despite calling on Meta to “fundamentally increase the transparency” of cross-check, the board has itself struggled to achieve full transparency, with many key details missing from its report.
Despite “repeated inquiries”, the board has been unable to find out who is on the cross-check list. It could not confirm the exact number of people listed or obtain detailed examples of cross-checked posts.
“Such limited disclosure impairs the board’s ability to discharge its statutory oversight responsibilities,” the board complained in its report.
The board previously said Meta was “not completely candid” about cross-check, failing to mention the program when the board examined the suspension of President Trump, and then describing it as small when in reality it covered millions of users.
However, while whistleblower Ms Haugen accused Meta of “repeatedly lying” about the scheme, Mr Hughes disagreed, saying he believed the information the board had been given was “accurate” and “satisfactory”, and that the board’s inquiry process had demonstrated its strength.
Critics argue that Meta’s underlying problems are too large for the oversight board to fix, since implementing its most substantive recommendations would require the company to hire tens of thousands more human moderators, especially in countries outside the US and Canada.
Although those two countries account for only 9% of monthly active users, they account for 42% of cross-checked content, the board found.
“The Haugen papers paint a picture of systemic inequality, with the US, for all its moderation problems, getting the lion’s share of moderation resources while virtually everywhere else gets essentially nothing,” said Cori Crider, director of Foxglove, a legal group suing Meta on behalf of former Facebook content moderator Daniel Motaung.
“Until that imbalance is corrected, I don’t see the oversight board’s opinion making much difference.”