Facebook has offered a justification for its removal of the video of Philando Castile’s death. Along with appeasing a section of the populace who were highly critical of the removal of the video, which documented the minutes immediately following the shooting, Facebook has also shed more light on its content removal policies and the modus operandi of its content review teams, particularly with respect to live videos.
According to Facebook, the video of Philando Castile’s death was indeed taken down, but due to a technical glitch and not for any other reason. This also puts paid to the theories attributing the removal to Facebook classifying the video as too violent, to a very high volume of reports, or to a police request.
In case you are unaware of the case, the video was recorded and broadcast by Diamond Reynolds after her partner Philando Castile was shot dead by a police officer when they were stopped for a broken tail light. The video, as can be expected, is pretty disturbing, and also shows Reynolds’ four-year-old daughter, who was in the backseat at the time.
The video was shot during a traffic stop in Falcon Heights, Minnesota at about 9 p.m. on Wednesday night, July 6. It has since generated millions of views and has played an important role in fomenting public unrest over the matter. So when Facebook suddenly took it down for a while, questions were raised over Facebook’s standards.
Meanwhile, the video has been restored with a graphic content disclaimer. Facebook, however, has offered no explanation beyond citing the glitch, and has declined to comment on speculation that the sudden removal and restoration was the result of a traffic spike or multiple reports. What the social networking giant has instead offered is a set of community guidelines specifying how content, live videos and otherwise, will be dealt with.
So here is the meat of the matter. Apart from the nature of the content, a lot also depends upon how and what is being portrayed through the video. For example, a video that shows violence for what it is and opposes it is likely to stay up, whereas a video that glorifies violence, along the lines of “hey, this is great, more stuff like this should happen,” is very likely to be taken down.
Another important bit that came out into the open is the fact that the number of reports has no bearing upon whether something is checked or not. A single report will trigger the same action as a thousand.
The number of reports does not impact whether something will be removed. We never remove content simply because it has been reported a number of times.
On the use of Facebook Live, Facebook said that Live was governed by the same standards as any of Facebook’s other content, and that the context of the content was the most important factor in deciding what happened to it.
For instance, if a person witnessed a shooting, and used Facebook Live to raise awareness or find the shooter, we would allow it. However, if someone shared the same video to mock the victim or celebrate the shooting, we would remove the video.
Once a live video is reported, Facebook’s content review teams, which are active 24/7 around the year, are activated and try to determine the nature of the content. Depending upon what they find, they can either take the video down, attach a disclaimer warning users about graphic or violent content, or leave it as it is.
When a disclaimer is applied, minors are kept from watching the video and it doesn’t autoplay in the News Feed. Meanwhile, even if a video isn’t reported, Facebook monitors its content once it reaches a high enough viewership.
Speaking to TechCrunch on the recent controversy, a Facebook spokesperson said,
We’re very sorry that the video was temporarily inaccessible. It was down due to a technical glitch, and restored as soon as we were able to investigate. We can confirm it was streaming live on Facebook. A couple hours after, it was down for about an hour.
However, the company has still refrained from going into any details of the glitch.
Although Facebook’s policies appear to be sound enough (there are no unbalanced restrictions, there is a team of live humans to monitor content, and there is no stupid algorithm to mess things up), it really should ensure that these “temporary glitches” are kept to a minimum.
Not only do these unexplained glitches undermine Facebook’s credibility as a fair distributor, both as a corporation and as a news publisher, but they also take away a lot of the power given to the public through the ability to record, document and broadcast events in real time.