
multilingual-MiniLMv2-L6-mnli-xnli

Developed by MoritzLaurer
A lightweight multilingual zero-shot classification model supporting 100+ languages, made compact through distillation.
Downloads: 2,521
Released: 2/11/2023

Model Overview

This model was obtained by distilling XLM-RoBERTa-large and fine-tuning on the XNLI and MNLI datasets. It is suited to multilingual natural language inference (NLI) and zero-shot classification tasks.
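The NLI formulation underlying this model can be illustrated with a minimal sketch: the model's classification head emits three logits per premise/hypothesis pair, which a softmax converts into entailment, neutral, and contradiction probabilities. The logit values below are assumptions purely for illustration, not actual model output.

```python
import math

NLI_LABELS = ["entailment", "neutral", "contradiction"]

def softmax(logits):
    """Convert raw logits into probabilities."""
    exps = [math.exp(x - max(logits)) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def nli_predict(logits):
    """Map a 3-way logit vector to an NLI label and its probability."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return NLI_LABELS[best], probs[best]

# Hypothetical logits for one premise/hypothesis pair (assumed values).
label, prob = nli_predict([3.1, 0.2, -1.5])
print(label)
```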

Model Features

Multilingual support
Supports zero-shot classification in 100+ languages, with dedicated evaluation on 15 major languages.
Lightweight design
Compressed through distillation, offering faster inference and lower memory requirements than the original large model.
Cross-lingual transfer capability
Can classify text in languages for which it has no task-specific training data, demonstrating cross-lingual reasoning ability.

Model Capabilities

Multilingual text classification
Natural language inference
Zero-shot learning
Cross-lingual transfer

Use Cases

Text classification
News categorization
Zero-shot topic classification of multilingual news content (politics, economy, entertainment, etc.).
Achieved 72.1% accuracy on the Chinese test set.
Semantic understanding
Semantic relation judgment
Determines entailment, neutral, or contradiction relationships between sentence pairs.
Average accuracy of 71.3% on the XNLI test sets.
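The zero-shot topic classification described above can be sketched as follows: each candidate label is turned into an NLI hypothesis (e.g. "This text is about politics."), the model scores entailment for each (text, hypothesis) pair, and the entailment scores are normalized across labels. The fixed entailment logits below stand in for real model outputs and are assumptions purely for illustration.

```python
import math

def softmax(xs):
    exps = [math.exp(x - max(xs)) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Stand-in entailment logits: in practice, the NLI model produces one
# entailment logit per (text, hypothesis) pair. These are assumed values.
FAKE_ENTAILMENT_LOGITS = {"politics": 2.4, "economy": 0.3, "entertainment": -1.1}

def zero_shot_classify(text, labels, template="This text is about {}."):
    # One hypothesis per candidate label, scored for entailment,
    # then normalized across labels to rank them.
    hypotheses = [template.format(label) for label in labels]
    logits = [FAKE_ENTAILMENT_LOGITS[label] for label in labels]
    probs = softmax(logits)
    return sorted(zip(labels, probs), key=lambda pair: -pair[1])

ranked = zero_shot_classify("Parliament passed the new budget bill.",
                            ["politics", "economy", "entertainment"])
print(ranked[0][0])  # top label under the assumed scores
```

In practice, the Hugging Face `transformers` zero-shot-classification pipeline wraps these steps, so users do not construct hypotheses or normalize scores by hand.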