With Donald Trump returning to the White House, tech companies are moving quickly to align themselves with the incoming administration, or risk falling out of favor with a famously impulsive Trump. Meta, the parent of Facebook, Instagram, and Threads, is taking perhaps one of the most radical steps of all.
In a video message, CEO Mark Zuckerberg said that Meta is taking a page out of X's book and altering how its social media platforms, including Facebook, Instagram, and Threads, handle user-generated content. The company, he said, would "get rid of factcheckers and replace them with community notes similar to X".
The fact-checking program has been part of Meta's family of apps for nearly a decade. It was introduced after the 2016 US elections to address concerns about misinformation spreading on social media. However, Meta's executives have expressed dissatisfaction with the system's outcomes, citing instances of bias and errors, and the third-party program often drew criticism from users who felt their posts were wrongfully flagged or removed.
“In recent years we’ve developed increasingly complex systems to manage content across our platforms, partly in response to societal and political pressure to moderate content. This approach has gone too far. As well-intentioned as many of these efforts have been, they have expanded over time to the point where we are making too many mistakes, frustrating our users and too often getting in the way of the free expression we set out to enable. Too much harmless content gets censored, too many people find themselves wrongly locked up in ‘Facebook jail,’ and we are often too slow to respond when they do,” Joel Kaplan, Chief Global Affairs Officer at Meta, announced in an official statement. “We want to fix that and return to that fundamental commitment to free expression. Today, we’re making some changes to stay true to that ideal.”
In place of the third-party program, Meta is introducing a model known as “Community Notes,” which will allow users to flag and add context to content deemed misleading. The system, akin to the one employed by Elon Musk’s X (formerly Twitter), is intended to reduce reliance on external fact-checkers and give users a greater role in shaping the platform’s content moderation. Users will write and rate the Community Notes, and the feature will roll out in the US over the next few months.
In addition, Meta’s moderation efforts will mostly focus on high-severity violations, such as content related to terrorism, child exploitation, and illegal drug trafficking. Under the revised system, content related to political or social debates will not face the same level of scrutiny unless users actively report it as harmful. Users can thus discuss a wide range of topics (Meta cites examples like immigration and gender identity) without fear of their content being suppressed. The company will also stop demoting fact-checked content, replacing the demotions with a more straightforward label indicating that additional information is available for the content.