Facebook is trying its best to curb the spread of hate speech and related content in countries experiencing conflict. The social media giant has introduced new measures to limit such content by restricting the number of times a message can be forwarded on its Messenger app. The restriction has been implemented in two countries – Sri Lanka and Myanmar.
Under the new rule, a person can forward a particular message only a certain number of times; the limit is currently set at five people. This closely resembles the change WhatsApp introduced earlier this year to reduce forwarded messages around the globe, where the forwarding limit is 20 chats in most countries but a maximum of four in India.
Sri Lanka blocked access to the social network, as well as two other Facebook-owned platforms, WhatsApp and Instagram, in an effort to stem mob violence directed at the country's Muslim minority, after the platforms were used to spread and amplify hate between communities on the island. Similar events were observed in Myanmar, where Facebook was used to stoke violence against the Rohingya ethnic group.
In Myanmar, Facebook has begun to reduce the distribution of all content shared by people who have demonstrated a pattern of posting material that violates its Community Standards. If this method proves successful, the social media giant may roll it out in other countries as well.
“By limiting visibility in this way, we hope to mitigate against the risk of offline harm and violence,” said Facebook’s Samidh Chakrabarti, director of product management and civic integrity, and Rosa Birch, director of strategic response.
They added: “In cases where individuals or organisations more directly promote or engage violence, we will ban them under our policy against dangerous individuals and organisations.”
The company said it is using AI to detect abusive content, adding any graphics that violate its policies to a photo bank so they can be automatically deleted when they crop up in similar posts.
Facebook has also banned a number of armed groups in Myanmar for violating its terms of service. The company is trying to tackle misinformation and hate speech in several countries, including India, the Philippines, and Indonesia.