
XLM-RoBERTa Large XNLI

Developed by joeddav
Based on the xlm-roberta-large pre-trained model and fine-tuned on NLI data in 15 languages, designed for zero-shot text classification
Downloads 109.12k
Release Time: 3/2/2022

Model Overview

Supports multilingual zero-shot text classification, particularly for non-English languages; fine-tuned on the cross-lingual NLI dataset XNLI
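
A minimal sketch of how the model is typically used for zero-shot classification via the Hugging Face transformers pipeline; the Russian example sentence and the English candidate labels are illustrative choices, not part of the listing above.

```python
from transformers import pipeline

# Load the zero-shot classification pipeline with this model.
classifier = pipeline("zero-shot-classification",
                      model="joeddav/xlm-roberta-large-xnli")

# Russian: "Who are you voting for in 2020?"
sequence = "За кого вы голосуете в 2020 году?"
candidate_labels = ["Europe", "public health", "politics"]

result = classifier(sequence, candidate_labels)
# Labels are returned sorted by score, highest first.
print(result["labels"][0], result["scores"][0])
```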

Model Features

Multilingual Support
Supports zero-shot classification in 15 languages, with the base model pre-trained on 100 languages
Cross-Lingual Capability
Labels and the texts to be classified can be in different languages, enabling cross-lingual classification (see the sketch after this list)
NLI Fine-Tuning
Fine-tuned on MNLI and XNLI datasets for natural language inference tasks
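
A sketch of the cross-lingual capability noted above: the input text is French while the candidate labels are English, relying on the shared multilingual encoder. The example sentence and the French hypothesis template are assumptions for illustration.

```python
from transformers import pipeline

classifier = pipeline("zero-shot-classification",
                      model="joeddav/xlm-roberta-large-xnli")

# French: "The team won the championship after a difficult season."
sequence = "L'équipe a remporté le championnat après une saison difficile."
candidate_labels = ["sports", "economy", "technology"]

# The hypothesis template can be written in the language of the input text,
# which often helps for non-English inputs.
result = classifier(sequence, candidate_labels,
                    hypothesis_template="Ce texte parle de {}.")
print(list(zip(result["labels"], result["scores"])))
```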

Model Capabilities

Zero-Shot Text Classification
Multilingual Text Understanding
Cross-Lingual Inference

Use Cases

Text Classification
Political Text Classification
Multi-label classification of politics-related texts (e.g., elections, foreign policy)
Can identify the political domains a text touches on
Cross-Lingual Content Moderation
Classification and moderation of multilingual user-generated content
No need to train a separate model for each language (a multi-label sketch follows this list)
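
A hedged sketch covering both use cases: multi-label scoring of political topics over multilingual inputs, assuming a transformers version whose zero-shot pipeline accepts multi_label=True. The example texts and label set are hypothetical.

```python
from transformers import pipeline

classifier = pipeline("zero-shot-classification",
                      model="joeddav/xlm-roberta-large-xnli")

texts = [
    # Spanish: "The government announced new sanctions against the neighboring country."
    "El gobierno anunció nuevas sanciones contra el país vecino.",
    # German: "Voter turnout was unusually high this year."
    "Die Wahlbeteiligung war dieses Jahr ungewöhnlich hoch.",
]
candidate_labels = ["elections", "foreign policy", "economy", "spam"]

for text in texts:
    # multi_label=True scores each label independently, so one text can
    # match several topics at once without a per-language model.
    result = classifier(text, candidate_labels, multi_label=True)
    print(text)
    for label, score in zip(result["labels"], result["scores"]):
        print(f"  {label}: {score:.2f}")
```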