After reviewing the contentious cross-check program used on Facebook and Instagram, Meta’s Oversight Board has released a report recommending that the company make the system “radically” more transparent and devote more resources to it.
The cross-check program, which routes content from prominent public figures, including former president Donald Trump before his suspension from Facebook, into a separate moderation queue, was criticized for “many deficiencies” by the semi-independent Oversight Board. Beyond incidents where rule-breaking content, including at least one instance of non-consensual pornography, was left up for an extended period, the report faulted Meta for failing to make clear when accounts are protected by special cross-check status. It also criticized the company for not keeping moderation statistics that could be used to evaluate the program’s accuracy.
While Meta told the board that cross-check was designed to advance the company’s human rights commitments, the review found that the program appeared more directly geared toward business concerns. The board acknowledges that Meta is a commercial enterprise, but it warns that cross-check, by giving preferential treatment to users selected largely with commercial interests in mind, allows harmful content to stay online longer than it otherwise would.
And it was shielding a select group of users from enforcement, many of whom were unaware that they were included.
The Wall Street Journal first reported publicly on cross-check roughly a year ago. Meta asked the Oversight Board to evaluate the program in the wake of that reporting, though the board had previously complained that Meta withheld crucial information about cross-check, including how it factored into the moderation of Trump’s posts. Today’s report follows an extended back-and-forth between Meta and the board, including the review of “thousands” of pages of internal documents, four briefings from the company, and answers to 74 written questions. The final report includes charts, figures, and statements from Meta that shed light on how the company coordinates a sprawling review process.
Oversight Board member Alan Rusbridger tells The Verge, “It’s a little portion of what Meta does, but I think that by spending this amount of time and delving into this [much] detail, it highlighted something that’s a bit more systemic inside the corporation.” He says he has faith that many at Meta share a commitment to free expression, press freedom, and the safety of people working in civil society, but that the program the company built wasn’t delivering on those values: it was protecting only a small number of people, who had no idea they were on the list.
Cross-check is meant to reduce reliance on automated moderation by routing decisions about certain accounts to human reviewers. Its members include journalists reporting from conflict zones and public figures whose posts make headlines, although, as Rusbridger notes, they aren’t told they have this protection. Publishers, performers, companies, and non-profits covered by the program are designated “business partners.”
The report cites statements from Meta indicating that the program errs toward under-enforcing the company’s policies in order to avoid the “appearance of censorship” or a bad experience for people who bring significant money and users to Facebook and Instagram. According to Meta, a decision on a given piece of content takes more than five days on average. Backlogs of content awaiting moderation sometimes delay decisions much longer; in one extreme case, a piece of content sat in the queue for nearly seven months.
The Oversight Board has often criticized Meta for removing politically or artistically charged content. In this case, though, it expressed concern that Meta was prioritizing its business relationships over the safety of its users. When Brazilian soccer player Neymar posted nude images of a woman who had accused him of rape, the cross-check backlog delayed a decision, and despite the clear violation of Meta’s rules, his account was not suspended. The board notes that Neymar subsequently signed an exclusive deal with Meta to stream his content.
Part of the underlying problem is scale: because Facebook and Instagram are so enormous, ordinary users don’t receive the same level of human moderation. In October 2021, Meta told the Oversight Board that it takes 100 million enforcement actions on content every day. At that volume, it would be extremely difficult, if not impossible, for humans to review every decision. But the board says it’s unclear whether Meta tracks or attempts to measure the accuracy of cross-check compared with ordinary content moderation. If it did, the results might show that cross-check lets Meta under-enforce its rules for prominent users, or that a large amount of content from ordinary users is being incorrectly flagged as rule-breaking.
“My hope is that Meta will retain its nerve.”
The board gave Meta 32 recommendations. (As usual, Meta must respond within 90 days but isn’t obligated to adopt them.) The recommendations include hiding posts flagged as “high severity” violations while a review is pending, even when they come from business partners. To improve moderation of “expression that is important for human rights,” the board asks Meta to handle that content in a queue separate from the one used for Meta’s business partners. It also calls on Meta to set “clear, public criteria” for who is included on cross-check lists and, for certain categories such as state actors and business partners, to publicly mark that status.
Some suggestions, like publicly marking accounts, could be implemented without committing many additional resources. But Rusbridger acknowledges that others, like clearing the cross-check backlog, would require a “significant” expansion of Meta’s moderation workforce. And the report arrives during a period of austerity for Meta: the company laid off roughly 13 percent of its workforce earlier this month.
As Meta tightens its belt, Rusbridger hopes that content moderation will remain a priority alongside “harder” technical programs, and that the company will not back down: “as tempting as it is to sort of cut the ‘soft’ areas, I think in the long term, they must realize that’s not a very wise thing to do.”