Facebook has come under fire again for failing to remove child exploitation imagery from its platform. A news report, the result of a BBC investigation, claims that the social network has failed in its responsibility to remove such imagery.
This new investigation builds upon an earlier one from the broadcaster. Last year, the BBC reported that closed Facebook groups were being used to share images of child exploitation. Facebook committed to making improvements at the time, and its head of public policy elaborated on his own commitment to removing such content from the platform. Following the outcry raised in the wake of the report, Facebook said it had improved its reporting system.
However, the BBC begs to differ. In an article published today, the news organization reported that content depicting child exploitation is still circulating on the platform and that Facebook failed to remove the vast majority of it even after it was reported.
The BBC used Facebook's own report button to flag as many as 100 images that violated the company's policy on what is and isn't permitted on the platform. However, only 18 were taken down. What's more, the BBC also found five convicted pedophiles with a presence on the social network. It reported them to Facebook as well, in the hope that the accounts would be quickly removed, since Facebook policy prohibits convicted sex offenders from having accounts on the website.
However, none of the accounts were taken down, casting a worrying pall over the company's ability to enforce the rules it professes to have introduced to ensure the safety of its users. Facebook is as common among young teens today as it is among adults, and the former often use their accounts without any form of adult supervision. As such, it is seriously worrying that convicted pedophiles can maintain a presence on the platform even after being reported.
Reflecting those doubts, the chairman of the UK House of Commons’ media committee, Damian Collins, told the BBC:
I think it raises the question of how can users make effective complaints to Facebook about content that is disturbing, shouldn’t be on the site, and have confidence that that will be acted upon.
What's more, Facebook then went on to report the BBC to the authorities, after the company asked the broadcaster to send samples of the kind of content that was circulating. The distribution of such content is illegal in the UK and must be reported, per CEOP guidelines.
Meanwhile, Facebook says that it has now removed all the reported content that was also against the company's standards. It also said that it is constantly working to improve its reporting and moderation measures. In the longer term, Facebook is working to introduce a series of AI-powered content-flagging systems that it hopes will bring a manifold improvement to the quality of content on the platform. However, such systems are still several years away.