January 19, 2017

How to Check User-Generated Content and Images for Unwanted Nudity


User-generated content is like a box of chocolates filled with potentially unwanted nudity – you never know what you’re going to get. Whether you’re a developer building an app that relies on user-uploaded listings (like Airbnb or eBay) or you’re helping a brand run an online contest with user-generated content (like GoPro), you can use this simple and effective method to check user-generated content for Not Safe For Work subject matter and prevent it from being uploaded and shared.

 

Kevin Lewis, Clarifai JavaScript Developer

The Not Safe For Work (NSFW) model is super useful for moderating and filtering user-generated content, especially for stopping questionable content from reaching users on platforms where they aren’t expecting it. Recently, I built a small JavaScript project that automatically checks that an image selected for upload in a form is Safe For Work (SFW) before the form can be submitted.

Before we start though, it’s worth noting that this is a completely client-side project. If you require something more robust, I’d recommend implementing a similar check as part of your server-side data validation. It also requires JavaScript, so it won’t work if JavaScript has been disabled in the browser.

Right, disclaimers out of the way, let’s crack on. If you just want to jump ahead and look at the code, the GitHub repository is here, and the readme should be enough to get you started. The code is also heavily commented, so feel free to dive straight in if you’re comfortable doing so.

How is this going to work?

 

[Video: How to Upload an Image to the Clarifai App]


We’re going to disable the form submit button with JavaScript (this means that the form will still submit if JS is turned off), and then check if the image is SFW. If it is, we will enable the form submit. If it does not pass the test, the button remains disabled.
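
In miniature, the gating idea looks like this (a conceptual sketch only – #my-form is a stand-in for your form’s ID, and the full implementation later in this post guards the click event rather than toggling the disabled attribute, but the effect is the same):

// Disable the submit button until the image has been checked
$("#my-form input[type=submit]").prop("disabled", true);

// ...and once Clarifai approves the image:
$("#my-form input[type=submit]").prop("disabled", false);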

Get set up with this project
Firstly, we’ll need a Clarifai account. You can get one here. Create a new application and take note of your Client ID and Client Secret – don’t share these with anyone else.


Next, let’s set up our markup for this project. If you’re starting from scratch, create a file called index.html and make it look the same as mine. If you’re building this into an existing project, just make sure you have jQuery and the Clarifai JavaScript client included.
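
If you don’t have my file to hand, a minimal index.html along these lines will do (the filenames, IDs, classes, and script URLs here are placeholders – grab the current JavaScript client from Clarifai’s docs):

<!DOCTYPE html>
<html>
<head>
  <title>NSFW checker demo</title>
</head>
<body>
  <!-- The form ID and file input class must match FORM_ID and FILE_CLASS in options.js -->
  <form id="upload-form" action="/upload" method="post" enctype="multipart/form-data">
    <input type="file" class="image-upload" name="image">
    <input type="submit" value="Upload">
  </form>

  <!-- jQuery and the Clarifai JavaScript client, then our own files -->
  <script src="https://code.jquery.com/jquery-3.1.1.min.js"></script>
  <script src="clarifai-latest.js"></script>
  <script src="keys.js"></script>
  <script src="options.js"></script>
  <script src="script.js"></script>
</body>
</html>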


Now create a keys.js file. It’s important that this is a separate file, as it will house our Client ID and Secret. If you’re using git, please make sure to add this file to your .gitignore so you don’t share this information.
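
Something like this is all keys.js needs to contain (placeholder values shown – swap in your own credentials):

// keys.js – keep this file out of version control
var CLIENT_ID = 'your-client-id';
var CLIENT_SECRET = 'your-client-secret';
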
Finally for project setup, I’d like you to create an options.js file. This is where our configuration will happen. Make it look like mine. Here’s a rundown of the options, with a sample file after the list:

 

  • The SFW_LOWER_LIMIT variable sets the minimum SFW score (on a scale from 0 to 1) an image must exceed to pass the test.
  • The FORM_ID variable is the ID given to the form we’re conducting a test on.
  • The FILE_CLASS variable is the class which the specific file input is given.
  • The function clarifaiCheckPass() will run if the user’s image passes the test, and clarifaiCheckFail() will run if it does not.
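
Here’s what options.js might look like (the threshold, names, and callback bodies are just example values – tune them to your app):

// options.js – configuration for the NSFW checker

// Images must score above this SFW value (0–1) to pass
var SFW_LOWER_LIMIT = 0.85;

// The ID of the form we're checking, and the class on its file input
var FORM_ID = 'upload-form';
var FILE_CLASS = 'image-upload';

// Runs when the image passes the check
function clarifaiCheckPass() {
  console.log('Image approved – the form can now be submitted.');
}

// Runs when the image fails the check
function clarifaiCheckFail() {
  alert('Sorry, this image looks NSFW and cannot be uploaded.');
}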

Let’s build this thing!

If you want to follow along with the finished JavaScript file, here it is.
First of all, let’s initialize a new application using the Clarifai JavaScript client.

var app = new Clarifai.App(CLIENT_ID, CLIENT_SECRET);

Next, a small bit of boilerplate code which will take an image file from an input and convert it to a Base64-encoded string. This is the format Clarifai needs in order to run the image through the NSFW model.

File.prototype.convertToBase64 = function(callback) {
  var reader = new FileReader();
  reader.onload = function(response) {
    // Strip the "data:image/...;base64," prefix so only the raw Base64 data remains
    var base64 = response.target.result.replace(/^data:image\/(.*);base64,/, '');
    callback(base64);
  };
  reader.onerror = function(err) { callback(err); };
  reader.readAsDataURL(this);
};


Let’s introduce a way to see at a glance what state a file input is in. We’re going to do this by adding the class ‘working’ to the input when it has been submitted to Clarifai, and then changing it to either ‘approved’ or ‘rejected’ once we get a result. I suggest styling these with CSS for quick visual feedback – there’s an example after the function below.

function stateClass(classToAdd) {
  $("#" + FORM_ID + " ." + FILE_CLASS)
    .removeClass("working approved rejected")
    .addClass(classToAdd);
}
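
As a starting point, the styling could be as simple as this (purely illustrative – use whatever fits your design):

/* style.css – visual feedback for each state of the file input */
.working  { outline: 2px solid orange; }
.approved { outline: 2px solid green; }
.rejected { outline: 2px solid red; }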

Now for the heavy lifting, although luckily Clarifai will make it a lot easier than you might expect. In this next function, we’re going to make a call to Clarifai. Below is the code, and we’ll run through a detailed explanation afterwards.

function validateFileInputs(image) {
  stateClass("working");
  app.models.predict(Clarifai.NSFW_MODEL, {base64: image}).then(
    function(response) {
      var pass;
      var data = JSON.parse(response.request.response);
      var concepts = data.outputs[0].data.concepts;
      // Find the "sfw" concept and compare its score against our threshold
      $.each(concepts, function(k, v) {
        if (v.name == "sfw") {
          pass = v.value > SFW_LOWER_LIMIT;
        }
      });
      parseResponse(pass);
    },
    function(err) { console.log(err); }
  );
}

We feed this function an image (by this point it would have been converted to a Base64-encoded string). We then use app.models.predict() and tell it which model we’re querying against (NSFW) and give it the image string.


It will then go through the results looking for the SFW concept, setting pass to true if the value is higher than the lower limit and false if it is not. Finally, it calls the parseResponse() function, which we’ll go through below, with the true or false result.
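
For reference, the relevant part of the API response looks roughly like this (heavily abbreviated, with invented values – real responses carry status and metadata fields too):

{
  "outputs": [{
    "data": {
      "concepts": [
        { "name": "sfw",  "value": 0.97 },
        { "name": "nsfw", "value": 0.03 }
      ]
    }
  }]
}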

parseResponse() is an incredibly simple function. It sets the correct state class on the file input, and then calls clarifaiCheckPass() or clarifaiCheckFail(), which we defined in the options file.

function parseResponse(pass) {
  if(pass === true) {
    stateClass("approved");
    clarifaiCheckPass();
  } else {
    stateClass("rejected");
    clarifaiCheckFail();
  }
}

Pulling it all together
So far we’ve written all of these useful functions, but we never actually call them. There are two pieces of functionality left to get this application working – the first is to actually trigger the NSFW check when the form’s file input is changed, and the second is to stop the form from being submitted until the check has passed.

$(document).ready(function() {

  $("#" + FORM_ID + " ." + FILE_CLASS).on('change',function(){
    var selectedFile = this.files[0];
    selectedFile.convertToBase64(function(base64){
      validateFileInputs(base64);
    });
  });

  $("#" + FORM_ID + " input[type=submit]").click(function(e) {
    e.preventDefault();
    if($("#" + FORM_ID + " ." + FILE_CLASS).hasClass("approved")) {
      console.log("clicked");
      $(this).unbind('click').submit();
    }
  });
});

So wait, how does this work again?

[Image: NSFW model for checking UGC]


There you have it – your very own nudity checker for your online forms. You can use this as a lightweight way to solve many problems, like making sure users on your dating app can only submit tasteful, non-nude pics, or making sure the photo contest you’re running doesn’t sear your eyeballs unexpectedly. Share your particular use case with @Clarifai for a chance to be featured in the blog!

