
Machine learning is becoming more ubiquitous by the day, and Yahoo has now joined in by open-sourcing an important model, one aimed at keeping the internet clean and safe for work.

Known as the open NSFW model (open_nsfw on GitHub), the deep neural network is used to filter pornographic, not-safe-for-work content off the interwebs. Using computer vision, it can automatically identify whether an image contains adult content so that it can be filtered out.

The neural network was trained on a large, regularly updated corpus of pornographic and benign images, and it can now automatically detect NSFW images (images only). The model is scoped narrowly to pornography, however: it does not address sketches, cartoons, text, images of graphic violence, or other types of unsuitable content.

Defining NSFW material is subjective and the task of identifying these images is non-trivial. Moreover, what may be objectionable in one context can be suitable in another. For this reason, the model we describe below focuses only on one type of NSFW content: pornographic images,

reads the research blog post.

The open_nsfw model is built with convolutional neural networks to improve the accuracy of image classification. It runs on the widely used open-source Caffe deep learning framework and was trained with CaffeOnSpark. The network takes an image as input and outputs a probability (a score between 0 and 1) that can be used to detect and filter NSFW content. For training, the dataset labeled NSFW images as positive examples and SFW images as negative ones.
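To get a feel for how such a classifier is used, here is a minimal sketch of scoring a single image with Caffe's Python bindings. The file names (nsfw_model/deploy.prototxt, nsfw_model/resnet_50_1by2_nsfw.caffemodel), the prob output layer, and the preprocessing constants follow the project's GitHub repository at release; treat them as assumptions and defer to the repo's own example script for the exact pipeline.

```python
# Minimal sketch: score one image with the open_nsfw Caffe model.
# Paths, layer names, and preprocessing follow the GitHub repo's example;
# verify them against the repository before relying on this.
import numpy as np
import caffe

# Load the network definition and pre-trained weights in test mode.
net = caffe.Net('nsfw_model/deploy.prototxt',
                'nsfw_model/resnet_50_1by2_nsfw.caffemodel',
                caffe.TEST)

# Standard Caffe preprocessing: channels-first layout, BGR channel order,
# 0-255 pixel range, and per-channel mean subtraction.
transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
transformer.set_transpose('data', (2, 0, 1))             # HWC -> CHW
transformer.set_mean('data', np.array([104, 117, 123]))  # BGR channel means
transformer.set_raw_scale('data', 255)                   # [0,1] -> [0,255]
transformer.set_channel_swap('data', (2, 1, 0))          # RGB -> BGR

image = caffe.io.load_image('test_image.jpg')  # RGB floats in [0,1]
net.blobs['data'].data[...] = transformer.preprocess('data', image)

# Forward pass; 'prob' is a two-way softmax over [SFW, NSFW].
output = net.forward()
nsfw_score = float(output['prob'][0][1])
print('NSFW probability: %.3f' % nsfw_score)
```

The README's guidance at release was that scores below roughly 0.2 suggest an image is likely safe and scores above roughly 0.8 suggest it is very likely NSFW, with everything in between left to the developer's own thresholding.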

[Image: Yahoo's open NSFW model. Credits: GitHub]

This project is, as Yahoo puts it, the only known open-source model for detecting NSFW images, and it has now been released on GitHub under a BSD 2-Clause license. The permissive license lets developers use, modify, and redistribute the model, though Yahoo is not releasing the training images, so the model cannot be reproduced from scratch. The company is inviting developers to experiment with the classifier and provide feedback for its improvement.

While Yahoo has open-sourced a very important model that will help police content on the Internet, it has also added a disclaimer at the very end, stating:

This model is a general purpose reference model, which can be used for the preliminary filtering of pornographic images. We do not provide guarantees of accuracy of output, rather we make this available for developers to explore and enhance as an open source project.
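In practice, "preliminary filtering" means layering a policy on top of the raw probability. Here is a minimal, hypothetical sketch: the 0.2 and 0.8 cutoffs echo the README guidance mentioned above, while the triage function and its labels are invented for illustration and should be tuned per application.

```python
# Hypothetical triage policy layered on the model's output score.
# Cutoffs are illustrative, not prescribed by Yahoo.
def triage(nsfw_score, low=0.2, high=0.8):
    if nsfw_score < low:
        return 'likely_sfw'    # pass through automatically
    if nsfw_score > high:
        return 'likely_nsfw'   # block or blur automatically
    return 'needs_review'      # ambiguous: queue for human review

print(triage(0.05))  # likely_sfw
print(triage(0.95))  # likely_nsfw
print(triage(0.50))  # needs_review
```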

This model could be very helpful not only for search engines like Google and Bing, but also for photo and video sharing platforms such as Instagram and Pinterest. So, if you're keen on trying it out and contributing to the project, head to the GitHub page right here.

