Yahoo Open NSFW Model V1

Determine whether an image contains pornographic content

Released in 2016 by Yahoo, this net is a binary classifier (safe/not safe) fine-tuned from a pre-trained ResNet-50 model. It determines whether an image is not safe for work (NSFW) due to the presence of nudity and/or pornographic content. The ResNet-50 architecture was chosen to provide a good tradeoff between accuracy and computational cost.

Number of layers: 177 | Parameter count: 5,944,514 | Trained size: 24 MB

Training Set Information

Examples

Resource retrieval

Get the pre-trained net:

In[1]:=
NetModel["Yahoo Open NSFW Model V1"]
Out[1]=
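Once retrieved, the net can be inspected to confirm the layer and parameter counts listed above. A minimal sketch using NetInformation (property names as documented in WL 11.2 and later):

```wl
net = NetModel["Yahoo Open NSFW Model V1"];

(* number of layers in the net *)
NetInformation[net, "LayersCount"]

(* total number of trained parameters *)
NetInformation[net, "ArraysTotalElementCount"]
```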

Basic usage

Apply the trained net to an input image:

In[2]:=
(* Evaluate this cell to get the example input *) CloudGet["https://www.wolframcloud.com/obj/479791ef-28d2-4c7e-b097-3646cb064a04"]
Out[2]=

Obtain the probabilities for each class for two images:

In[3]:=
(* Evaluate this cell to get the example input *) CloudGet["https://www.wolframcloud.com/obj/a556f73d-df3e-42d8-9e31-ccb1da45389a"]
Out[3]=
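Beyond the top class, per-class probabilities can be requested via the standard "Probabilities" output property of a net with a class decoder. A sketch with a hypothetical input image:

```wl
net = NetModel["Yahoo Open NSFW Model V1"];

(* hypothetical file name; substitute any Image object *)
img = Import["exampleImage.jpg"];

(* association mapping each class to its probability *)
net[img, "Probabilities"]
```

This is useful when you want to threshold the NSFW score yourself rather than accept the decoder's default decision.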

Requirements

Wolfram Language 11.2 (September 2017) or above

Resource History

Reference