Internet News

Facebook wants its users to control the amount of nudity and violence they see on its platform


Facebook is a pretty great place. You get to meet new people, stay in touch with old friends, and more. Over the years, however, the platform has evolved, and the content on it now spans a spectrum with a good end and a bad end. Facebook wants its users to see only as much of the bad side as they are comfortable with, and to that end it is planning customizable filters for nudity, profanity and violence.

At present, Facebook uses a standard review system: content that gets reported enough times attracts the attention of its review team, which then takes the content down if it is found to be offensive, racially prejudiced or in violation of local laws. The system works, but it has cracks. For instance, the company had to restore an iconic image from the Vietnam War after it was taken down because it contained nudity.

Facebook is now looking to avoid issues like these by leaving the choice with users themselves. As CEO Mark Zuckerberg put it in his recently released manifesto on building a global community:

The idea is to give everyone in the community options for how they would like to set the content policy for themselves. Where is your line on nudity? On violence? On graphic content? On profanity? What you decide will be your personal settings. We will periodically ask you these questions to increase participation and so you don’t need to dig around to find them. For those who don’t make a decision, the default will be whatever the majority of people in your region selected, like a referendum. Of course you will always be free to update your personal settings anytime.

With a broader range of controls, content will only be taken down if it is more objectionable than the most permissive options allow.

So if you are fine with whatever the rest of the people in your region are fine with, great: you will see content curated according to the preferred settings of the majority in your region. But if you decide you would like a little less nudity or profanity, or that you can tolerate a bit more violence, Facebook will let you update your settings and the changes will take effect accordingly.

The system is clearly still in the works. Reading between the lines, it appears that the bulk of it would be powered by artificial intelligence. Indeed, the task might prove too much for human curators: sorting through content based on the individual preferences of Facebook's 1.8 billion+ users would not be easy.

There would be other issues as well. For instance, who gets to decide what teens or kids see? Does Facebook decide their policy for them? Does it engage in dialogue with their parents or guardians? Or does a universal standard apply until they come of age?

That said, the changes will give Facebook users far more control over the kind of content they see. In the long run, that could well make Facebook a better place for everyone.
