
It is said that the only thing that spreads faster than news is fake news. The pandemic bore this out as social media (especially WhatsApp) became a primary carrier of fake news and misinformation, spreading it far and wide. This has prompted companies to clamp down on the spread of misinformation, and in pursuit of that goal, Google's video streaming service YouTube revealed that it had removed no fewer than a million videos related to "dangerous coronavirus information, like false cures or claims of a hoax."

Misinformation about COVID-19 and vaccines was rampant on social media until recently, and even now the platforms are not free of the problem. This has earned social media companies the ire of political leaders for failing to stem the spread of harmful misinformation and disinformation. In a blog post, YouTube's Chief Product Officer Neal Mohan said the company relied on "expert consensus from health organizations," including the CDC and the World Health Organization, but noted that in some cases "misinformation is less clear-cut" as new facts emerge.

“Our policies center on the removal of any videos that can directly lead to egregious real-world harm,” he wrote.

Mohan was correct in his assessment that misinformation has moved into the mainstream and is present in every sector of society, "sometimes tearing through communities with blistering speed." YouTube tries to attack the problem at its roots, removing almost 10 million videos each quarter, most of which never reach 10 views.

If you are worried that your videos on YouTube are going to be taken down, rest assured that "bad content" accounts for only a tiny fraction (roughly 0.16-0.18% of total views) of the vast multitude of videos on YouTube.

For now, YouTube is also working on speeding up the removal of videos that contain misinformation while surfacing videos from authoritative sources. Not only will this stem the flow of fake news, but it will also give people an accurate picture of the situation and help them make informed decisions.

Social media is both a rallying point and a forum for people to air their views and opinions, and when those views spread inaccurate information about one of the burning issues of the day, there are bound to be consequences. Since the 2020 presidential election, for example, YouTube claims to have removed "thousands" of videos for violating its election-related policies, including content falsely claiming that widespread fraud changed the outcome of any past U.S. presidential election; three-fourths of those were taken down before reaching 100 views.

While removal of content is necessary, the practice must be used judiciously: an overly aggressive approach to removals could send the message that controversial ideas are unacceptable, restricting the fundamental right to freedom of speech and expression.