Source: Thought Catalog

In a world where knowledge and information are key to survival, the rapid dissemination of misinformation and disinformation yields hazardous results. More than two months after the storming of the US Capitol by a mob of supporters of former US President Donald Trump, Facebook has cracked down on Facebook Groups, the popular forums for shared interests that have been implicated in the viral spread of harmful misinformation and in violations of its community standards, in an effort to rein in a product that played a high-profile role in the protests that led to the Capitol riot.

Facebook imposed new penalties on repeat offenders, whether groups or individuals, warning that they will be removed entirely if their conduct is egregious.

The company has been criticized both for doing too little to police discourse on its platform and for censoring users. Even Facebook’s own researchers agreed that the company’s oversight of the product was weak.

“Groups and members that violate our rules should have reduced privileges and reach, with restrictions getting more severe as they accrue more violations until we remove them completely,” Tom Alison, vice president of engineering, said in a blog post. “And when necessary in cases of severe harm, we will outright remove groups and people without these steps in between.”

Facebook recently acknowledged that some of its largest civic-focused Groups were toxic and that it was alarmed by their growth. “70% of the top 100 most active US Civic Groups are considered non-recommendable for issues such as hate, misinformation, bullying, and harassment,” it said in an internal presentation.

Toxic content in large and influential groups like “Stop the Steal” helped organize the Capitol riot, while anti-vaccine groups sowed doubt about the safety and effectiveness of the COVID-19 vaccines. As the results have shown, such content goes viral with real-world consequences, most notably the Capitol riot itself. Top groups functioned less as communities than as megaphones for partisan publishers and purveyors of “hate bait,” racially and politically charged content meant to elicit calls for violence, Facebook said.

Facebook Groups was redesigned in 2017 around private or public communal spaces where people could gather over common interests or challenges. Today the feature has an enormous user base: more than half of Facebook’s 2.8 billion users belong to at least five groups.

From now on, Facebook will penalize groups that accrue strikes for breaking platform rules against misinformation, hate speech, and other types of content the platform deems harmful. Facebook will also limit notifications so people are less likely to join these groups, and new members will be warned on joining that the group has been flagged for violations. For existing members, content from the group will be pushed down in News Feed.

Other rules include requiring administrators or moderators to temporarily approve all posts in groups with a large number of members who have broken the rules, or whose members belonged to other groups that were removed for breaking the rules. Facebook may also prohibit members of such groups from inviting their friends to join, and block admins from forming new groups if their current ones have been flagged for harmful conduct. It will also stop recommending political and health-related groups to users outside of the U.S., extending a restriction that currently applies only to U.S. users.

“This means that content won’t be shown to the wider group until an admin or moderator reviews and approves it. If an admin or moderator repeatedly approves content that breaks our rules, we’ll take the entire group down,” Alison said. “These measures are intended to help slow down the reach of those looking to use our platform for harmful purposes and build on existing restrictions we’ve put in place over the last year,” he added.

Facebook is not the only platform taking measures against the spread of misinformation. Twitter has also been penalizing users who post misleading information on topics like the COVID-19 vaccine and election integrity.