
NSFW Image Classifier

Developed by quentintaranpino
A FocalNet fine-tuned NSFW image classification model for content moderation tasks, categorizing images into safe, needs review, and unsafe classes.
Downloads 199
Release Time: 2/16/2025

Model Overview

This is an image classification model fine-tuned from the FocalNet architecture, designed specifically to identify and classify Not Safe For Work (NSFW) content in content moderation scenarios.
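The model can be loaded like a standard Hugging Face image classification checkpoint. The sketch below is a minimal example assuming the transformers and Pillow libraries; the repository id and the exact label strings are assumptions and should be checked against the published model card.

```python
from PIL import Image
from transformers import pipeline

# Hypothetical repository id -- replace with the actual Hub path from the model card.
MODEL_ID = "quentintaranpino/nsfw-image-classifier"

# The image-classification pipeline loads the fine-tuned FocalNet weights
# (stored as safetensors) together with the matching image processor.
classifier = pipeline("image-classification", model=MODEL_ID)

image = Image.open("example.jpg")
for prediction in classifier(image):
    # Each prediction is a dict with a class label (e.g. safe / needs review /
    # unsafe -- the exact strings come from the model's config) and a score.
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```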

Model Features

High-precision classification
Accurately classifies images into safe, needs review, and unsafe categories, suitable for content moderation.
Based on FocalNet architecture
Uses microsoft/focalnet-base as the base model, offering excellent image processing capabilities.
Fine-tuned with additional data
Incorporates additional image data provided by nostrcheck.me for fine-tuning, improving recognition accuracy and robustness.
Safetensors format
The model weights are distributed in the safetensors format, which allows safe loading and broad tooling compatibility.

Model Capabilities

Image classification
NSFW content detection
Content moderation

Use Cases

Content moderation
Social media content filtering
Automatically detects and filters inappropriate content on social media platforms.
Significantly reduces manual review workload and enhances content safety.
Adult content identification
Identifies and classifies adult or inappropriate content for further processing.
Improves platform content quality and protects users from exposure to inappropriate material.
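In a moderation pipeline, the three classes map naturally to routing decisions. The following is a minimal sketch assuming the classifier output format shown in the loading example above; the label strings and the review threshold are placeholders to be adapted to the model's actual configuration.

```python
def moderate(predictions, review_threshold=0.5):
    """Map classifier output to a moderation action.

    `predictions` is the list of {"label", "score"} dicts returned by the
    classifier. The label strings below are placeholders and must match the
    model's actual id2label mapping.
    """
    top = max(predictions, key=lambda p: p["score"])
    if top["label"] == "unsafe":
        return "block"
    if top["label"] == "needs_review" or top["score"] < review_threshold:
        return "queue_for_human_review"
    return "allow"


# Example with mocked scores: a confident "safe" prediction is allowed through.
decision = moderate([
    {"label": "safe", "score": 0.91},
    {"label": "needs_review", "score": 0.07},
    {"label": "unsafe", "score": 0.02},
])
print(decision)  # allow
```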