
Roberta Base Zeroshot V2.0 C

Developed by MoritzLaurer
A zero-shot text classification model based on the RoBERTa architecture. It classifies text without task-specific training data, runs on both GPU and CPU, and was trained exclusively on commercially friendly data.
Downloads: 3,188
Release date: 3/22/2024

Model Overview

This model is part of the zeroshot-v2.0 series. It achieves universal text classification by reformulating tasks in the Natural Language Inference (NLI) format, making it suitable for zero-shot classification across many domains.

Model Features

Business-friendly Data
Trained on synthetic data generated with Mixtral and on the commercially licensed MNLI and FEVER-NLI datasets, meeting strict copyright requirements.
Zero-shot Classification
Performs classification without any task-specific training data, adapting to arbitrary label sets via hypothesis templates.
Production Environment Optimization
Compatible with Hugging Face TEI inference containers and flash attention, making it suitable for production deployment.
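The NLI reformulation behind these features can be sketched in plain Python: each candidate label is inserted into a hypothesis template, each premise/hypothesis pair is scored for entailment, and the label whose hypothesis is most entailed wins. The `toy_scorer` below is a hypothetical stand-in for the model's actual entailment head, used only to make the control flow concrete.

```python
# Minimal sketch of zero-shot classification via NLI formatting.
# score_entailment is a hypothetical stand-in for the RoBERTa NLI
# model's entailment head.

def build_hypotheses(labels, template="This text is about {}"):
    """Turn candidate labels into NLI hypotheses."""
    return {label: template.format(label) for label in labels}

def classify(premise, labels, score_entailment, template="This text is about {}"):
    """Pick the label whose hypothesis gets the highest entailment score."""
    hypotheses = build_hypotheses(labels, template)
    scores = {label: score_entailment(premise, hyp)
              for label, hyp in hypotheses.items()}
    return max(scores, key=scores.get), scores

def toy_scorer(premise, hypothesis):
    """Toy entailment scorer based on keyword overlap, for illustration only."""
    topic = hypothesis.rsplit(" ", 1)[-1]
    return 1.0 if topic.lower() in premise.lower() else 0.0

best, scores = classify(
    "The report covers economics and fiscal policy.",
    ["politics", "economics", "sports"],
    toy_scorer,
)
```

Because the label set is supplied at inference time, the same model handles any classification task without retraining.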

Model Capabilities

English Text Classification
Multi-domain Zero-shot Inference
Single-label/Multi-label Classification

Use Cases

Content Classification
News Topic Classification
Automatically categorizes news articles into predefined categories such as politics and economics.
Achieves an average f1_macro of 0.72 in zero-shot mode across 28 tasks.
Content Moderation
Violation Content Detection
Identifies whether text involves prohibited content such as violence or hate speech.
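Both use cases can be run with the Hugging Face `zero-shot-classification` pipeline. The snippet below assumes the `transformers` library is installed; the first call downloads the model weights. The example texts and label sets are illustrative, not from the model card.

```python
from transformers import pipeline

# Load the model (downloads weights on first use).
classifier = pipeline(
    "zero-shot-classification",
    model="MoritzLaurer/roberta-base-zeroshot-v2.0-c",
)

# News topic classification (single-label: scores sum to 1).
result = classifier(
    "The central bank unexpectedly raised interest rates by 50 basis points.",
    candidate_labels=["politics", "economics", "sports", "technology"],
    hypothesis_template="This text is about {}",
)
print(result["labels"][0])  # highest-scoring label

# Content moderation (multi-label: each label judged independently).
moderation = classifier(
    "I strongly disagree with this policy decision.",
    candidate_labels=["violence", "hate speech"],
    multi_label=True,
)
```

The `hypothesis_template` argument is what lets the same checkpoint adapt to new tasks: changing the template and label list redefines the classification problem without retraining.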