NSFW Visual Classifier
General Information
Purpose: This model predicts the likelihood that an image contains suggestive or sexually explicit nudity, making it suitable for automatically moderating or filtering nudity on a platform.
Architecture: InceptionV2 with modifications
Intended Use: Moderation; adult content filtering
Limitations: This model is limited to nudity-specific use cases. For moderation use cases that also cover drugs and/or gore, use Clarifai's image Moderation model.
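As a minimal sketch of querying the model, the snippet below follows the shape of Clarifai's public v2 predict endpoint; the personal access token and example image URL are placeholders you would supply yourself.

```python
import requests

PAT = "YOUR_PAT"  # personal access token from your Clarifai account (placeholder)
MODEL_ID = "nsfw-recognition"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={
        "inputs": [
            # Placeholder image URL; a base64-encoded image works here too.
            {"data": {"image": {"url": "https://example.com/photo.jpg"}}}
        ]
    },
    timeout=30,
)
response.raise_for_status()

# Each returned concept carries a name ("nsfw" / "sfw") and a confidence in [0, 1].
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']}: {concept['value']:.4f}")
```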
Training Data & Taxonomy
Training Data
The model was trained and tested on an internal dataset of approximately 8M images, of which roughly 0.4M were held out for validation.
Taxonomy
- NSFW
- SFW
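Because the taxonomy is just these two paired concepts, a moderation decision reduces to thresholding the nsfw score. Below is a hypothetical helper built on the predict response shown earlier; the `should_block` name and the 0.85 threshold are illustrative choices, not values from this model card.

```python
def should_block(concepts, nsfw_threshold=0.85):
    """Return True if the image's 'nsfw' score meets the threshold.

    `concepts` is a list of {"name": ..., "value": ...} dicts, as returned
    by the predict call sketched above. The threshold should be tuned
    against your own tolerance for false positives vs. false negatives.
    """
    scores = {c["name"]: c["value"] for c in concepts}
    return scores.get("nsfw", 0.0) >= nsfw_threshold

# Example: a clearly explicit image would typically score high on "nsfw".
print(should_block([{"name": "nsfw", "value": 0.97},
                    {"name": "sfw", "value": 0.03}]))  # True
```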
- ID: nsfw-recognition
- Name: nsfw-v1.0
- Model Type ID: Visual Classifier
- Description: NSFW model for identifying whether images or video are safe for viewing (SFW) or not safe for viewing (NSFW) in American workplaces.
- Last Updated: Oct 25, 2024
- Privacy: PUBLIC