
nsfw-clustering

Not safe for work (NSFW) model for grouping visually similar images that are not safe for viewing in American workplaces.

Notes

NSFW Clusterer

General Information

Purpose: This model groups visually similar images by clustering their vector embeddings according to the likelihood that an image contains suggestive or sexually explicit nudity. It is well suited to anyone looking to automatically moderate or filter nudity on their platform.

Architecture: InceptionV2 with modifications

Intended Use: Moderation; adult content filtering

Limitations: This model is limited to nudity-specific use cases. For moderation use cases that also cover drugs and/or gore, use Clarifai's Image Moderation model.
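To make the clustering idea concrete, here is a minimal sketch of grouping embedding vectors by cosine similarity with a greedy threshold pass. This is an illustration only: the threshold value, the greedy assignment strategy, and the toy 2-D vectors are assumptions for demonstration, not the model's actual algorithm.

```python
import numpy as np

def cluster_embeddings(embeddings, threshold=0.9):
    """Greedy single-pass clustering: assign each embedding to the first
    cluster whose centroid it matches above the cosine-similarity
    threshold, otherwise start a new cluster. Illustrative sketch only;
    not the NSFW Clusterer's actual method."""
    # Normalize rows so dot products equal cosine similarity.
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    centroids, labels = [], []
    for vec in normed:
        sims = [float(vec @ c) for c in centroids]
        if sims and max(sims) >= threshold:
            labels.append(int(np.argmax(sims)))
        else:
            # No cluster is similar enough; this vector seeds a new one.
            centroids.append(vec)
            labels.append(len(centroids) - 1)
    return labels

# Two near-duplicate vectors and one dissimilar vector.
emb = np.array([[1.0, 0.0], [0.99, 0.01], [0.0, 1.0]])
print(cluster_embeddings(emb))  # → [0, 0, 1]
```

In the model's setting, the embeddings would come from the image encoder, and images whose embeddings land in the same cluster are treated as visually similar for moderation review.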

Training Data & Taxonomy

Training Data

The model was trained and tested on an internal dataset of approximately 8M images, of which roughly 0.4M were held out for validation.

Taxonomy

  • NSFW
  • SFW
  • ID
  • Name
    nsfw
  • Model Type ID
    Clusterer
  • Last Updated
    Oct 25, 2024
  • Privacy
    PUBLIC