
Multilingual MiniLMv2-L12 MNLI-XNLI

Developed by MoritzLaurer
A multilingual natural language inference model supporting over 100 languages, suitable for zero-shot classification tasks
Downloads: 245
Release Time: 2/11/2023

Model Overview

This model is a multilingual MiniLMv2-L12 model distilled from XLM-RoBERTa-large and fine-tuned for multilingual natural language inference (NLI), making it applicable to zero-shot text classification.
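Since the card presents the model as a zero-shot text classifier, a minimal usage sketch with the Hugging Face zero-shot-classification pipeline may help. The model id MoritzLaurer/multilingual-MiniLMv2-L12-mnli-xnli is assumed from the title and author above; verify the exact id on the hub before use.

```python
from transformers import pipeline

# Assumed model id, inferred from the card title and author.
model_id = "MoritzLaurer/multilingual-MiniLMv2-L12-mnli-xnli"

classifier = pipeline("zero-shot-classification", model=model_id)

# German input text; labels can be phrased in any language the model covers.
text = "Angela Merkel ist eine Politikerin in Deutschland und Vorsitzende der CDU"
candidate_labels = ["politics", "economy", "entertainment", "environment"]

result = classifier(text, candidate_labels, multi_label=False)
print(result["labels"][0], result["scores"][0])  # top label and its score
```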

Model Features

Multilingual Support
Supports natural language processing tasks in over 100 languages
Efficient Inference
Smaller and faster than the original XLM-RoBERTa-large model while maintaining good performance
Zero-shot Classification
Capable of classification without task-specific training
Cross-lingual Transfer Capability
Can handle tasks in languages not included in the training data (see the sketch after this list)
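To illustrate the NLI formulation behind these features, here is a minimal sketch that scores a premise/hypothesis pair directly with the sequence-classification head. The cross-lingual pair is invented for illustration, and the model id is assumed as above.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed model id; adjust if the hub id differs.
model_id = "MoritzLaurer/multilingual-MiniLMv2-L12-mnli-xnli"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Cross-lingual pair: English premise, French hypothesis (illustrative only).
premise = "The new smartphone was released last week and sold out within hours."
hypothesis = "Le produit s'est vendu très rapidement."

inputs = tokenizer(premise, hypothesis, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Read the label mapping from the config rather than hard-coding label order.
probs = torch.softmax(logits, dim=-1)[0]
for idx, prob in enumerate(probs):
    print(model.config.id2label[idx], round(prob.item(), 3))
```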

Model Capabilities

Multilingual text classification
Natural language inference
Zero-shot learning
Cross-lingual transfer

Use Cases

Text Classification
News Classification
Classify news articles into categories such as politics, economy, entertainment, etc.
Average accuracy of 75% on the XNLI test set
Content Moderation
Multilingual Content Classification
Identify sensitive or inappropriate content in multilingual material (see the sketch below)
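For the content-moderation use case above, a hedged sketch of multi-label zero-shot classification over multilingual comments. The comments and candidate labels are invented for illustration, and the model id is assumed as before.

```python
from transformers import pipeline

# Assumed model id; labels and comments below are illustrative only.
classifier = pipeline(
    "zero-shot-classification",
    model="MoritzLaurer/multilingual-MiniLMv2-L12-mnli-xnli",
)

comments = [
    "Este producto es excelente, lo recomiendo.",      # Spanish
    "Dieser Kommentar enthält beleidigende Sprache.",  # German
]
labels = ["harmless", "offensive", "spam"]

for comment in comments:
    # multi_label=True scores each label independently.
    result = classifier(comment, labels, multi_label=True)
    print(comment)
    for label, score in zip(result["labels"], result["scores"]):
        print(f"  {label}: {score:.2f}")
```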