In October 2021, Facebook-parent Meta asked its Oversight Board to review its cross-check program and recommend how it could be improved. More than a year later, the Board has released its policy advisory opinion (PAO), in which it concludes that the current cross-check program serves Meta’s business interests more than it serves its users. Hardly a surprise.

Meta views the cross-check program as a second layer of review, and given the sheer scale of content moderation across its platforms, the program is a good idea in theory. Meta told the Board last year that it was making about 100 million enforcement attempts on content every day. At that volume, it is reasonable to assume that Meta will, at times, mistakenly remove content that does not violate its policies. This is what the cross-check review system is meant to address by providing additional layers of human review for flagged posts.

In practice, however, the system leaves a lot to be desired. In its in-depth report, the Board, which operates independently, said it found “several shortcomings in Meta’s cross-check programme,” such as granting greater protection to certain users who are selected largely according to business interests. This prioritization of select users allows content that would otherwise be removed from the platform to remain up and “potentially” cause harm.

According to a report by The Wall Street Journal, this list of select users includes a varied collection of politicians, celebrities, advertisers, health organizations and news publishers, among them former US President Donald Trump, his son Donald Trump Jr., Democratic Senator Elizabeth Warren, conservative activist Candace Owens, Brazilian footballer Neymar, and Meta CEO Mark Zuckerberg.

Under the current cross-check system, these users are eligible for an additional review of any posts alleged to violate Meta’s policies on violence, hate speech, misinformation, nudity and more, and in some cases are effectively exempted from Meta’s rules altogether. Given that Meta evidently favours these high-profile individuals, some of whom have been banned from platforms like Twitter, the kind of content they can post on Facebook or Instagram and get away with is concerning.

Cross-check “prioritizes users of commercial value to Meta and as structured does not meet Meta’s human rights responsibilities and company values,” Oversight Board director Thomas Hughes said in a statement. The Board also discovered that Meta had failed to track metrics on whether cross-check resulted in more accurate decisions, and “expressed concern about the lack of transparency around the programme.”

In its original request, Meta had asked the Board how it should “balance its desire to fairly and objectively apply our Community Standards with our need for flexibility, nuance, and context-specific decisions within cross-check,” while “minimising the potential for over-enforcement, retaining business flexibility, and promoting transparency in the review process.” It also asked for guidance on the criteria it should use to decide which users are included in the secondary review and how they are prioritised. The company has said it will consider the Board’s recommendations and respond within 30 days.