
NSFW Image Detection

Developed by Ateeqq
This model is fine-tuned specifically for NSFW image classification. It categorizes content into three safety-critical classes and is suited to content moderation, safety filtering, and compliant content processing systems.
Downloads: 24.89k
Release Time: 5/4/2025

Model Overview

An NSFW image classification model fine-tuned from google/siglip2-base-patch16-224 that accurately identifies content that is not safe for work.
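
As a hedged illustration of how such a checkpoint is typically used, the sketch below runs inference through the Hugging Face transformers image-classification pipeline. The model ID "Ateeqq/nsfw-image-detection", the example file name, and the label names are assumptions, not values confirmed on this page.

```python
# Minimal inference sketch (assumes the checkpoint is published on the
# Hugging Face Hub as "Ateeqq/nsfw-image-detection"; verify the actual ID).
from transformers import pipeline
from PIL import Image

classifier = pipeline(
    "image-classification",
    model="Ateeqq/nsfw-image-detection",  # assumed Hub ID
)

image = Image.open("example.jpg")  # hypothetical local file
results = classifier(image)

# Each entry is {"label": ..., "score": ...}; the exact label names depend on
# the checkpoint's config (safe / porn / gore here are assumptions).
for r in results:
    print(f"{r['label']}: {r['score']:.4f}")
```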

Model Features

High Accuracy
Achieves 99.02% accuracy on the validation set.
Multi-category Classification
Supports three content categories: violence and gore, pornography, and safe content.
Efficient Training
Requires only 5 training epochs to achieve high performance.
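
The page does not include a training script; the following is a minimal sketch of how a three-class fine-tune from google/siglip2-base-patch16-224 over 5 epochs could be configured with the transformers Trainer. The label names, dataset, and all hyperparameters other than the epoch count are placeholders.

```python
# Hedged fine-tuning sketch: a 3-class classification head on top of
# google/siglip2-base-patch16-224, trained for 5 epochs.
from transformers import (
    AutoImageProcessor,
    AutoModelForImageClassification,
    TrainingArguments,
    Trainer,
)

labels = ["safe", "porn", "gore"]  # assumed label names
model = AutoModelForImageClassification.from_pretrained(
    "google/siglip2-base-patch16-224",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={name: i for i, name in enumerate(labels)},
)
processor = AutoImageProcessor.from_pretrained("google/siglip2-base-patch16-224")

args = TrainingArguments(
    output_dir="nsfw-siglip2",
    num_train_epochs=5,              # the only value stated on this page
    per_device_train_batch_size=32,  # placeholder
    learning_rate=5e-5,              # placeholder
)

# With a prepared image dataset (not shown here), training would look like:
# trainer = Trainer(model=model, args=args, train_dataset=..., eval_dataset=...)
# trainer.train()
```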

Model Capabilities

NSFW image detection
Content safety classification
Violence and gore content identification
Pornography content identification

Use Cases

Content Moderation
Social Media Content Filtering
Automatically detects and filters images unsuitable for the workplace.
Accurately identifies over 99% of non-compliant content.
Compliant Content Processing
Ensures platform content meets safety standards.
Reduces manual review workload.
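
For the moderation scenarios above, a common pattern is to flag an image whenever any unsafe class scores above a threshold. The sketch below builds on the pipeline from the earlier example; the threshold value, label names, and the review helper are illustrative assumptions.

```python
# Hedged moderation-filter sketch: block an image when any unsafe label
# scores above a chosen threshold (0.5 here is illustrative, not official).
UNSAFE_LABELS = {"porn", "gore"}  # assumed label names
THRESHOLD = 0.5

def is_allowed(predictions: list[dict]) -> bool:
    """predictions: output of the image-classification pipeline for one image."""
    for p in predictions:
        if p["label"] in UNSAFE_LABELS and p["score"] >= THRESHOLD:
            return False
    return True

# Example with the classifier from the earlier sketch:
# if not is_allowed(classifier("upload.jpg")):
#     queue_for_human_review("upload.jpg")  # hypothetical helper
```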