
nsfw-recognition

NSFW model for identifying whether images or video are safe for viewing (SFW) or not safe for viewing (NSFW) in American workplaces.

Notes

NSFW Visual Classifier

General Information

Purpose: This model predicts the likelihood that an image contains suggestive or sexually explicit nudity. It is a strong fit for anyone who needs to automatically moderate or filter nudity on their platform.

Architecture: InceptionV2 with modifications

Intended Use: Moderation; adult content filtering (see the example call below)

Limitations: This model is limited to nudity-specific use cases. For moderation use cases that also cover drugs and/or gore, use Clarifai’s image Moderation model.
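Because the model is a two-class visual classifier, a typical integration is a single predict call per image. Below is a minimal sketch of such a call against Clarifai's v2 REST outputs endpoint; the users/clarifai/apps/main path, the CLARIFAI_PAT environment variable, and the image URL are assumptions to replace with your own values.

```python
# Minimal sketch (not an official Clarifai example) of one predict call per image.
# Assumptions: the model is reachable under the clarifai/main community app,
# CLARIFAI_PAT holds a valid Personal Access Token, and the image URL is a placeholder.
import os

import requests

MODEL_URL = (
    "https://api.clarifai.com/v2/users/clarifai/apps/main"
    "/models/nsfw-recognition/outputs"
)

payload = {
    "inputs": [
        {"data": {"image": {"url": "https://example.com/photo.jpg"}}}
    ]
}

resp = requests.post(
    MODEL_URL,
    headers={"Authorization": f"Key {os.environ['CLARIFAI_PAT']}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()

# The response carries one output per input; each concept is a taxonomy label
# with a confidence score between 0 and 1.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']}: {concept['value']:.3f}")
```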

Training Data & Taxonomy

Training Data

The model was trained and tested on an internal dataset of approximately 8M images, 0.4M of which were used for validation.

Taxonomy

  • NSFW
  • SFW
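Since the taxonomy contains only these two labels, a downstream moderation decision usually reduces to comparing the NSFW score against a threshold. The sketch below is illustrative only; the 0.85 threshold and the lowercase concept names are assumptions, not values from this model card.

```python
# Illustrative sketch of a moderation decision on top of the two-label taxonomy.
# The 0.85 threshold and lowercase concept names ("nsfw"/"sfw") are assumptions.
def is_safe_for_work(concepts: dict[str, float], nsfw_threshold: float = 0.85) -> bool:
    """Return True when the NSFW score is below the chosen threshold."""
    return concepts.get("nsfw", 0.0) < nsfw_threshold

# Example usage with scores in the shape returned by the predict call above.
scores = {"sfw": 0.91, "nsfw": 0.09}
print(is_safe_for_work(scores))  # True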
  • ID: nsfw-recognition
  • Name: nsfw-v1.0
  • Model Type ID: Visual Classifier
  • Last Updated: Oct 25, 2024
  • Privacy: PUBLIC