Purpose: This model predicts the likelihood that an image contains suggestive or sexually explicit nudity. It is well suited to automatically moderating or filtering nudity on a platform.
Architecture: InceptionV2 with modifications
Intended Use: Moderation; adult content filtering
Limitations: This model is limited to nudity-specific use cases. For moderation use cases that include drugs and/or gore, use Clarifai's Image Moderation model.
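As a usage illustration, the sketch below calls the model through Clarifai's v2 predict REST endpoint and flags an image when its NSFW score crosses a threshold. The API key, model ID, threshold value, and lowercase concept names are assumptions, not values taken from this card.

```python
import requests

# Placeholders: substitute your own API key and this model's actual ID.
API_KEY = "YOUR_API_KEY"
MODEL_ID = "YOUR_NSFW_MODEL_ID"
NSFW_THRESHOLD = 0.85  # assumed cutoff; tune to your platform's tolerance

def is_nsfw(image_url: str) -> bool:
    """Return True when the model's NSFW score meets the threshold."""
    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": image_url}}}]},
        timeout=30,
    )
    resp.raise_for_status()
    concepts = resp.json()["outputs"][0]["data"]["concepts"]
    scores = {c["name"]: c["value"] for c in concepts}
    # Concept names are assumed to be the lowercase forms of the taxonomy labels.
    return scores.get("nsfw", 0.0) >= NSFW_THRESHOLD
```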
Training Data & Taxonomy
Training Data
The model was trained and tested on an internal dataset of approximately 8M images, of which approximately 0.4M were held out for validation.
Taxonomy
NSFW
SFW
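Because the taxonomy is binary, each prediction carries both concepts, and for a two-class softmax classifier the two scores typically sum to one. The snippet below shows an assumed shape for such a concepts output and how a caller might pick the winning label; the values are illustrative, not real model output.

```python
# Illustrative only: an assumed shape for a binary concepts output.
example_concepts = [
    {"name": "nsfw", "value": 0.93},
    {"name": "sfw", "value": 0.07},  # the two scores are roughly complementary
]

# Take the higher-confidence concept as the predicted label.
top = max(example_concepts, key=lambda c: c["value"])
print(top["name"])  # -> nsfw
```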
ID:
Model Type ID: Visual Classifier
Input Type: image
Output Type: concepts
Description: NSFW model for identifying whether an image or video is safe for viewing (SFW) or not safe for viewing (NSFW) in American workplaces.
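Tying the image input to the concepts output, the loop below moderates a batch of image URLs with the hypothetical is_nsfw helper sketched earlier; the URLs are placeholders.

```python
# Placeholders: replace with the uploads you need to moderate.
urls = [
    "https://example.com/upload-1.jpg",
    "https://example.com/upload-2.jpg",
]

flagged, allowed = [], []
for url in urls:
    # is_nsfw is the hypothetical helper from the predict sketch above.
    (flagged if is_nsfw(url) else allowed).append(url)

print(f"flagged {len(flagged)} of {len(urls)} images")
```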