By the way, before we begin: the title is sarcastic, for those of us who are a bit of a Sheldon Cooper.

We have all heard that social media websites are bad, but the topic has never been discussed as vividly and extensively as in the Netflix documentary The Social Dilemma. The program raised a host of allegations against various social media giants, including Facebook, which has now come forward to defend itself in a rare and unusual public rebuttal of the documentary. Facebook says it wants to tell you what the documentary did not.

According to Facebook, the allegation that its apps profit off of addiction is untrue. The company claims that it aims to ‘create value’, not just drive usage. It backs up this claim by pointing out that it gives people control over how they use its products, which is why it provides time-management tools like an activity dashboard, a daily reminder, and ways to limit notifications.

Moreover, it counters the claim by saying, “In 2018 we changed our ranking for News Feed to prioritize meaningful social interactions and deprioritize things like viral videos. The change led to a decrease of 50M hours a day worth of time spent on Facebook. That isn’t the kind of thing you do if you are simply trying to drive people to use your services more.”

One of the scariest lines from the documentary, if you have seen it, was “You are the product.” Facebook, however, argues that the claim is misleading, and that when businesses purchase ads on Facebook, they do not know who you are. The company does provide advertisers with reports about the kinds of people who are seeing their ads and how those ads are performing, but it says this information is not enough to personally identify a user unless you give permission.

Thus, the company says that it does not sell user information.

The company also rebutted the allegation that it pushes polarizing content to its users. It said, “The overwhelming majority of content that people see on Facebook is not polarizing or even political—it’s everyday content from people’s friends and family.”

The company says that it reduces the amount of content on its platform that could drive polarization, including links to clickbait headlines or misinformation.

It also addressed claims about election fraud, its algorithm, and, most importantly, misinformation, using some very corporate-sounding language. The company claims that it has worked to better itself after the 2016 election fiasco, even though it has had a rough track record with politics this year. It says that its algorithm is just like the algorithms employed on every other social media platform. And it adds that it does not allow misinformation that has the potential to contribute to imminent violence, physical harm, or voter suppression to stay on its apps.