In Built with Clarifai

How to use Clarifai to protect your eyes from seeing something they can’t unsee

By Cassidy Williams

Follow along with our simple tutorial on how to use the Not Safe for Work (NSFW) model to protect your eyeballs from things they can’t unsee. Or, check out our documentation to build something on your own!

Ah, NSFW. The forbidden fruit. The illegal indulgence. For those of you who don’t know what NSFW is, we’re talking about Not Safe For Work. R-rated content. Stuff that would probably get you fired if you were looking at it at work.

Never fear, pervs and pervettes. We at Clarifai have a new endpoint to protect you from yourselves.

Let’s just say your weirdo friend Bob sends you a link to an image. But because of his dark past, you don’t want to just open this pic at work. That’s where we come in.

So, first things first, sign up for a free Clarifai developer account. Simple enough. They seem cool. Useful. Dangerous.


Now, make a new application. You can name it anything you’d like:


Once you’ve done this, the sly developer site redirects you to a Manage Applications page where you can find your Client ID and Client Secret. Love it.

Let’s take these with us, shall we?

Now, let’s beat Bob’s game with some Python. Ugh, not that Python, you freak.

Create a new folder named after the street you grew up on and the name of your first pet, open it up in your terminal, and run the following:

pip install git+git://
export CLARIFAI_APP_ID=<application_id_from_your_account>
export CLARIFAI_APP_SECRET=<application_secret_from_your_account>

In place of <application_id_from_your_account>, you put your Client ID. In place of <application_secret_from_your_account>, you put your Client Secret, not what you really think about Bob.
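If you want to double-check that those exports actually took before blaming Bob for a mysterious auth error, a quick sanity check from Python might look like this (a sketch; the variable names come from the export commands above, and the client is assumed to read them from the environment):

```python
import os

# Check that both credentials from the export commands above are set.
missing = [var for var in ('CLARIFAI_APP_ID', 'CLARIFAI_APP_SECRET')
           if not os.environ.get(var)]

if missing:
    print('Missing: ' + ', '.join(missing))
else:
    print('Credentials found, carry on.')
```

If anything prints as missing, re-run the exports in the same terminal session before moving on.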

Now let’s get crazy and code. Make a new Python file (call it whatever you like) and stick this in it:

from clarifai.client import ClarifaiApi
import sys

# Load the NSFW model and tag the image URL passed on the command line
clarifai_api = ClarifaiApi(model='nsfw-v0.1')
result = clarifai_api.tag_image_urls(str(sys.argv[1]))

# probs[0] is the 'safe' probability, probs[1] is the 'not safe' one
probs = result['results'][0]['result']['tag']['probs']

if probs[0] > probs[1]:
    print('Safe for work, you can trust Bob again (for now)!')
else:
    print('Not safe for work. Classic Bob.')

Save this file, and then go back to your terminal and run:

python <your_file_name>.py <bobs_image_url>

… and of course, stick Bob’s image URL in the <bobs_image_url> blurb.
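If you expect a lot of links from Bob, the decision logic can be pulled out into a little helper you can reuse or unit-test without hitting the API every time. This is just a sketch assuming the response shape used in the snippet above (the `is_safe_for_work` name is ours, not Clarifai's):

```python
def is_safe_for_work(response):
    """Return True when the 'safe' probability beats the 'not safe' one.

    Assumes the response shape from the snippet above:
    response['results'][0]['result']['tag']['probs'] == [p_safe, p_not_safe]
    """
    probs = response['results'][0]['result']['tag']['probs']
    return probs[0] > probs[1]


# A made-up response, just to show the helper in action:
fake_response = {'results': [{'result': {'tag': {'probs': [0.98, 0.02]}}}]}
print('Safe!' if is_safe_for_work(fake_response) else 'Classic Bob.')
```

That way the "is this Bob being Bob?" check lives in one place, and the script stays a one-liner around it.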

Boo yah. You’re officially protected from Bob, at least in this sense. If you want to check out the GitHub repo for the code, here you go!

You can thank us @Clarifai for whipping out this endpoint for you, pun intended. <3