‘Revenge porn’ has become a pressing issue on social media. With almost everyone you know holding a profile on one platform or another, it is important that users are protected from all forms of revenge porn.
To that end, Facebook has introduced a pilot program to test an updated version of its intimate-image detection system, now built on advanced artificial intelligence technology. The social media giant developed the new detection technology after consulting both victims and experts across a range of situations.
Facebook launched the pilot program to tackle the problem of ‘revenge porn’. The aim is to remove non-consensual intimate images before users even have to report them, improving privacy and the overall experience. The filter has also been rolled out on Facebook’s photo-sharing platform, Instagram.
The leaking of non-consensual intimate images has been a major problem plaguing the internet in recent years, and Facebook has come under heavy scrutiny for failing to take quick, decisive action. The improved image filter should allow the company to keep such content off the platform.
“We are thrilled to see the pilot expand to incorporate more women’s safety organizations around the world, as many of the requests that we receive are from victims who reside outside of the US,” said Holly Jacobs, Founder of the Cyber Civil Rights Initiative.
The new AI-based filter flags inappropriate content and takes the post down before a specially trained member of Facebook’s Community Operations team reviews it to confirm whether it is safe to be viewed. As an emergency option, the pilot program also allows users to submit particular photos they do not want shared on Facebook, without having to worry about security issues.
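The article does not detail how the matching works, but systems of this kind typically store a perceptual hash (a compact fingerprint) of a reported image and compare every new upload against it, so the platform never has to retain the sensitive image itself. The sketch below illustrates that general idea in Python; the file names, threshold, and `should_flag` helper are hypothetical, and this is not Facebook's actual implementation.

```python
from PIL import Image  # pip install Pillow


def average_hash(path, size=8):
    """Compute a simple perceptual (average) hash of an image.

    The image is shrunk to a size x size grayscale thumbnail, and one
    bit is set per pixel depending on whether it is brighter than the
    mean. Visually similar images produce similar bit patterns.
    """
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a, b):
    """Count the bits that differ between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical flow: hashes of pre-submitted images are kept in a
# block list, and each new upload is checked before it goes live.
BLOCKED_HASHES = {average_hash("reported_image.jpg")}


def should_flag(upload_path, threshold=5):
    """Flag an upload whose hash is close to any blocked hash."""
    h = average_hash(upload_path)
    return any(hamming_distance(h, b) <= threshold for b in BLOCKED_HASHES)


if should_flag("new_upload.jpg"):
    print("Flagged: hold post for human review")  # human-in-the-loop step
```

The threshold trades false positives against misses: a small Hamming distance tolerates recompression or resizing of the same image, while still letting the trained reviewer make the final call, as described above.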
For now, Facebook has made the new feature available only in Australia.