Yahoo open sources its NSFW-detecting neural network

Yahoo's Caffe deep neural network model (code on GitHub) focuses on detecting pornographic images only.

By: Tech Desk | Updated: October 2, 2016 4:40 pm

Yahoo has open-sourced its NSFW (not suitable/safe for work) detecting neural network, allowing developers to work towards improving the algorithm. Jay Mahadeokar and Gerry Pesavento of Yahoo explained in a blog post that defining NSFW material on the Internet is subjective and that identifying such images is non-trivial. “Since images and user-generated content dominate the Internet today, filtering NSFW images becomes an essential component of Web and mobile applications,” they said.

While algorithms can automatically classify NSFW content to a great extent, Yahoo’s Caffe deep neural network model (code on GitHub) looks at pornographic images only. Mahadeokar and Pesavento say identification of NSFW sketches, cartoons, text, images of graphic violence, or other types of unsuitable content is not addressed by this model.
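For developers curious how such a classifier is typically driven, here is a minimal pycaffe sketch of scoring a single image. The file names, blob names, and preprocessing constants in the comments are assumptions based on common Caffe deployments, not details confirmed by the blog post; the real files ship in Yahoo’s GitHub repository.

```python
import caffe
import numpy as np

# A sketch of scoring one image. The file names below are assumed;
# the actual model definition and weights come from Yahoo's repository.
net = caffe.Net('deploy.prototxt',                 # assumed network definition
                'resnet_50_1by2_nsfw.caffemodel',  # assumed trained weights
                caffe.TEST)

# Standard Caffe preprocessing: CHW layout, BGR channel order,
# per-channel mean subtraction (mean values assumed, not from the repo).
transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
transformer.set_transpose('data', (2, 0, 1))              # HWC -> CHW
transformer.set_mean('data', np.array([104, 117, 123]))   # assumed BGR means
transformer.set_raw_scale('data', 255)                    # [0,1] -> [0,255]
transformer.set_channel_swap('data', (2, 1, 0))           # RGB -> BGR

image = caffe.io.load_image('photo.jpg')
net.blobs['data'].data[...] = transformer.preprocess('data', image)
output = net.forward()

# Assumes the final softmax blob is named 'prob' and holds [SFW, NSFW].
nsfw_score = output['prob'][0][1]
print('NSFW score: %.4f' % nsfw_score)
```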

Yahoo’s model will allow developers to experiment with a classifier for NSFW detection, and they can provide feedback to make the whole system better. The model assigns each image a score ranging from 0 to 1, which can be used to detect NSFW content. “Developers can use this score to filter images below a certain suitable threshold based on a ROC curve for specific use-cases, or use this signal to rank images in search results,” the blog post explains.
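A minimal sketch of those two uses, assuming a score_image() helper like the one above; the threshold value here is illustrative, and in practice would be chosen from a ROC curve for the specific application:

```python
NSFW_THRESHOLD = 0.8  # illustrative cut-off; tune per use-case via a ROC curve

def filter_sfw(images, score_image, threshold=NSFW_THRESHOLD):
    """Keep only images whose NSFW score falls below the threshold."""
    return [img for img in images if score_image(img) < threshold]

def rank_by_safety(images, score_image):
    """Order search results from least to most likely NSFW."""
    return sorted(images, key=score_image)
```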


Yahoo will not release any of the training images or other details, keeping in mind the nature of the data. However, it has open-sourced the output model, which developers can use to classify NSFW content.