nsfw-recognition
NSFW model for identifying whether images or video are safe for viewing (SFW) or not safe for viewing (NSFW) in American workplaces.
Notes
NSFW Recognition Workflow
This workflow wraps the NSFW Recognition classifier, which classifies images for nudity and sexually explicit imagery.
The NSFW Recognition workflow predicts the likelihood that an image contains suggestive or sexually explicit nudity. It is a good solution for automatically moderating or filtering nudity on a platform, but it is limited to nudity-specific use cases.
Run the NSFW Recognition workflow
Using the Clarifai SDK
Export your PAT as an environment variable. Then, import and initialize the API Client.
Find your PAT in your security settings.
export CLARIFAI_PAT={your personal access token}
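If you prefer to set the token from Python (for example, in a notebook), a minimal sketch along these lines should also work. The CLARIFAI_PAT variable name comes from the export command above; the placeholder value and the early check are purely illustrative.

import os

# The Clarifai SDK reads your personal access token from the CLARIFAI_PAT
# environment variable. Setting it here is an alternative to `export`;
# replace the placeholder with your own token.
os.environ.setdefault("CLARIFAI_PAT", "your personal access token")

# Fail early with a clear message if the token is still missing.
if not os.environ.get("CLARIFAI_PAT"):
    raise RuntimeError("CLARIFAI_PAT is not set; create a PAT in your security settings.")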
Prediction with the workflow
from clarifai.client.workflow import Workflow

# URL of the publicly hosted NSFW Recognition workflow
workflow_url = 'https://clarifai.com/clarifai/image-moderation/workflows/nsfw-recognition'
image_url = "https://samples.clarifai.com/metro-north.jpg"

# Run the workflow on the image and keep the prediction response
prediction = Workflow(workflow_url).predict_by_url(image_url, input_type="image")

# Get workflow results
print(prediction.results[0].outputs[-1].data)
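The final output carries the predicted concepts and their probabilities. As a sketch of how you might act on them (assuming the output data exposes a concepts list with name and value fields, and using an arbitrary 0.85 threshold for illustration):

# The last output of the workflow holds the classifier's concepts,
# each with a name (e.g. "sfw" / "nsfw") and a probability value.
concepts = prediction.results[0].outputs[-1].data.concepts

for concept in concepts:
    print(f"{concept.name}: {concept.value:.4f}")

# Illustrative moderation rule: flag the image if the "nsfw" concept
# exceeds a chosen threshold (0.85 here is an arbitrary example).
NSFW_THRESHOLD = 0.85
is_nsfw = any(c.name == "nsfw" and c.value > NSFW_THRESHOLD for c in concepts)
print("Flag for review" if is_nsfw else "Safe for viewing")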
Using the Workflow
To use the NSFW Recognition workflow directly, add your own images through the blue plus "Try your own Input" button, and the workflow will output the probability distribution over its concepts.
- Workflow ID: nsfw-recognition
- Description: NSFW model for identifying whether images or video are safe for viewing (SFW) or not safe for viewing (NSFW) in American workplaces.
- Last Updated: Apr 01, 2024
- Privacy: PUBLIC