
Roberta Large Zeroshot V2.0 C

Developed by MoritzLaurer
A RoBERTa-large model for efficient zero-shot classification. It is trained exclusively on commercially friendly data and can classify text into arbitrary categories without any task-specific training data.
Downloads: 53
Release date: 3/22/2024

Model Overview

This model is part of the zeroshot-v2.0 series, is optimized for zero-shot classification tasks, and can run on both GPU and CPU. It is designed specifically for commercially sensitive scenarios and was trained only on data with commercially friendly licenses.

Model Features

Commercially friendly training data
Trained exclusively on synthetic data generated by Mixtral-8x7B and commercially friendly NLI datasets (MNLI, FEVER-NLI), suitable for scenarios with strict compliance requirements.
Zero-shot classification capability
Classifies text without task-specific training data by reformulating classification as a Natural Language Inference (entailment) task, enabling general-purpose text classification.
Efficient inference
Compatible with Hugging Face's production-grade Text Embeddings Inference (TEI) containers and FlashAttention, making it well suited to production deployment.
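In practice, each candidate label is turned into an NLI hypothesis and scored against the input text. The sketch below illustrates this using the standard Hugging Face `zero-shot-classification` pipeline; the model id is inferred from this page's title, and the hypothesis template is a simplified stand-in for the pipeline's default, so verify both against the model card on the Hub before relying on them.

```python
def build_hypotheses(labels, template="This text is about {}"):
    """Turn candidate labels into NLI hypotheses; the model then scores
    whether the input text (the premise) entails each hypothesis."""
    return [template.format(label) for label in labels]


def classify(text, labels):
    """Zero-shot classification via the standard Hugging Face pipeline.
    The import is kept local because the first call downloads roughly
    1.4 GB of model weights. MODEL_ID is assumed from this page's title."""
    from transformers import pipeline

    model_id = "MoritzLaurer/roberta-large-zeroshot-v2.0-c"
    classifier = pipeline("zero-shot-classification", model=model_id)
    return classifier(text, candidate_labels=labels)
```

The pipeline builds hypotheses like these internally and ranks the labels by their entailment probability, so no fine-tuning on the target labels is needed.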

Model Capabilities

Zero-shot text classification
Multi-label classification
Natural Language Inference
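The single-label and multi-label capabilities differ only in how the per-label entailment scores are combined. The sketch below shows the two decision rules on hypothetical score values; it mirrors the behavior of the Hugging Face pipeline's `multi_label` flag conceptually, not its exact logit-level implementation.

```python
def single_label(scores):
    """Single-label: renormalize entailment scores across all candidate
    labels so they sum to 1, then pick the highest-scoring label."""
    total = sum(scores.values())
    probs = {label: s / total for label, s in scores.items()}
    return max(probs, key=probs.get)


def multi_label(scores, threshold=0.5):
    """Multi-label: judge each label independently and keep every label
    whose entailment probability clears the threshold."""
    return [label for label, s in scores.items() if s >= threshold]


# Hypothetical per-label entailment probabilities for one input text.
scores = {"politics": 0.92, "economy": 0.81, "entertainment": 0.03}
```

With these scores, `single_label` returns only "politics", while `multi_label` keeps both "politics" and "economy", which is the behavior you want when categories are not mutually exclusive.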

Use Cases

Sentiment analysis
Product review classification: classifies e-commerce platform reviews as positive or negative sentiment; achieved a 0.977 F1 score on the Yelp reviews dataset.
Content moderation
Toxic content detection: identifies hate speech, insults, and other toxic content in text; achieved a 0.854 F1 score on the Wikipedia toxic-content (obscene) classification task.
Topic classification
News classification: classifies news articles by topic (e.g., politics, economics); achieved a 0.745 F1 score on the AG News dataset.