Facebook is in the news again, and once again over content moderation. This time, The Verge has reported a bug in Facebook’s main platform (now fixed) that severely affected its downranking system, the mechanism that keeps misinformation and harmful content out of your News Feed. The bug led to the promotion of misinformation, nudity, and Russian state media (which the company had pledged not to promote) in users’ feeds.

A group of the company’s engineers identified the bug, referring to it as a “massive ranking failure” that left half of all users exposed to “integrity risks”. According to an internal report, the error affected about half of all News Feed views over a period of six months.

Facebook engineers first noticed the issue in October 2021, when a surge of misinformation appeared in the News Feed. Instead of suppressing posts from repeat misinformation offenders that had been flagged by independent fact-checkers, the feed was promoting them. Unable to pinpoint the bug’s root cause, the engineers let it run its course and watched the trend subside within a few weeks. Similar spikes recurred until March 11th, when the bug was fixed.
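
The Verge’s report does not describe Facebook’s code, so the sketch below is purely illustrative: a minimal Python model of what a downranking step in a feed-ranking pipeline might look like, with every label name and weight invented here. A failure like the one reported would behave as if the demotion step silently stopped being applied.

```python
# Hypothetical sketch of a downranking step; all names and weights are
# invented for illustration and are not Facebook's actual code.

DEMOTION_FACTORS = {
    "fact_checked_false": 0.2,  # flagged false by independent fact-checkers
    "repeat_offender": 0.3,     # source repeatedly shares misinformation
    "probable_nudity": 0.5,
    "state_media": 0.4,
}

def ranked_score(base_score: float, labels: set[str]) -> float:
    """Multiply a post's base relevance score by every applicable demotion."""
    score = base_score
    for label in labels:
        score *= DEMOTION_FACTORS.get(label, 1.0)
    return score

# Normal behaviour: a flagged post sinks far down the feed.
print(round(ranked_score(10.0, {"fact_checked_false", "repeat_offender"}), 2))  # 0.6

# The reported failure would look as though demotions were never applied,
# leaving flagged posts ranked at their full base score.
print(ranked_score(10.0, set()))  # 10.0
```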

The bug’s impact spanned far wider than misinformation: the ranking system also failed to demote depictions of violence, probable nudity, and Russian state media, which the company had pledged not to promote in response to Russia’s invasion of Ukraine. The issue was designated a level-one site event (SEV-1), a crisis of the highest priority that must be resolved as soon as possible.

Meta Platforms Inc. (formerly Facebook Inc.) spokesperson Joe Osborne confirmed the incident in an official statement to The Verge.

Internal reports say the issue first appeared in 2019 but did not significantly affect content metrics until October 2021. Facebook’s leadership, meanwhile, has continued to take pride in its AI systems and its proactive approach to flagging inappropriate content. Even after these events, the company has remained discreet about how its downranking system shapes News Feed content.

In 2018, CEO Mark Zuckerberg explained that downranking helps deter users’ tendency to engage with “more sensationalist and provocative” content.
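
Zuckerberg’s point was that engagement naturally rises as content approaches the policy line, so demotion is meant to invert that incentive. The toy model below is an assumption made purely for illustration (the numbers are invented, not Facebook’s formula), showing distribution falling for borderline content even as raw engagement climbs.

```python
# Toy model of borderline-content demotion. All numbers are invented
# for illustration; this is not Facebook's actual formula.

def natural_engagement(proximity: float) -> float:
    """Engagement tends to climb as content nears the policy line.
    proximity: 0.0 = clearly benign, 1.0 = right at the policy line."""
    return 1.0 + 2.0 * proximity

def demoted_distribution(proximity: float) -> float:
    """A penalty that grows faster than engagement flips the incentive:
    the more borderline the content, the less distribution it gets."""
    penalty = 3.0 * proximity
    return max(natural_engagement(proximity) - penalty, 0.0)

for p in (0.0, 0.5, 1.0):
    print(f"proximity {p:.1f}: engagement {natural_engagement(p):.1f}, "
          f"distribution {demoted_distribution(p):.1f}")
```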

According to Sahar Massachi, a former member of Facebook’s civic integrity team, there is “no indication that there was malicious intent behind this recent ranking bug that impacted up to half of News Feed views over a period of months, and thankfully, it didn’t break Facebook’s other moderation tools. But the incident shows why more transparency is needed in internet platforms and the algorithms they use.”

Despite the measured opinion of Mr. Massachi (who is now a co-founder of the non-profit Integrity Institute), any Facebook user’s attention would be drawn to what looks like a borderline negligent response to a serious situation. For a platform that is now an inherent thread of humanity’s social fabric, a platform with the power to influence behaviour at an unprecedented scale, one has to ponder: how much accountability is too much accountability?