Facebook

Facebook, in light of massive criticism surrounding its content policy, decided to offload contentious moderation decisions to an Oversight Board. While this was done to take some pressure off the social media platform, the plan looks like it might have backfired on the Mark Zuckerberg-led company. According to the board’s first transparency report, Facebook has not been forthcoming with users about why their content is being removed. Moreover, the report says that even the board itself has been left in the dark in some cases.

The report covers user requests from the fourth quarter of 2020 and the first two quarters of 2021. During this period, the board argues, Facebook was not fully forthcoming with it about the “cross-check” system, which the company uses to review content decisions relating to high-profile users.

However, the company has agreed to let the board review the cross-check system and make recommendations on how it can be improved.

Over these three quarters, Facebook and Instagram users submitted around 524,000 cases to the board. The report also says that users submitted more cases every quarter: about 114,000 in the fourth quarter of 2020, 203,000 in the first quarter of 2021, and around 207,000 in the second quarter of 2021.

Geographically, American and Canadian users submitted the most cases, accounting for 46% of the total. They were followed by Europe with 22%, and Latin America and the Caribbean with 16%. The Asia Pacific and Oceania region made up 6% of the cases, the Middle East and North Africa 4%, and Central and South Asia and Sub-Saharan Africa 2% each.

“We do not believe this represents the actual distribution of Facebook content issues around the globe. If anything, we have reason to believe that users in Asia, Sub-Saharan Africa, and the Middle East experience more, not fewer, problems with Facebook than parts of the world with more appeals,” the board said.

Of the total cases, 36% concerned hate speech, 31% bullying and harassment, and 13% violence and incitement. Adult nudity and sexual activity (9%) and dangerous individuals and organizations (6%) made up most of the remaining cases.

This comes at a time when Facebook is desperately looking for a fresh start and might even be looking to rebrand in line with its new metaverse vision.