
XLM (xlm-mlm-xnli15-1024)

Developed by FacebookAI
XLM is a Transformer-based model pre-trained with a masked language modeling objective; this checkpoint supports cross-lingual text classification in 15 languages.
Release time: 3/2/2022

Model Overview

This multilingual Transformer was pre-trained with a masked language modeling objective and fine-tuned on English NLI data, and handles text classification tasks in 15 languages.

Model Features

Multilingual support
Supports text classification tasks in 15 languages, including English, French, and Chinese.
Cross-lingual transfer learning
Although fine-tuned only on English NLI data, it transfers well to the other 14 languages.
Efficient training
Uses float16 operations to accelerate training and reduce memory usage.
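
The float16 point above comes down to simple arithmetic: each parameter occupies 2 bytes instead of float32's 4, halving parameter memory. A minimal sketch (the parameter count below is an illustrative placeholder, not this checkpoint's actual size):

```python
def param_memory_mb(num_params: int, bytes_per_param: int) -> float:
    """Approximate parameter memory in megabytes (1 MB = 1e6 bytes)."""
    return num_params * bytes_per_param / 1e6

# Halving bytes per parameter halves parameter memory.
# 250M parameters is a hypothetical figure for illustration only.
fp32_mb = param_memory_mb(250_000_000, 4)  # float32: 4 bytes/param
fp16_mb = param_memory_mb(250_000_000, 2)  # float16: 2 bytes/param
```

The same factor-of-two saving applies to activations stored in float16, which is where the training-speed benefit on modern accelerators comes from.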

Model Capabilities

Cross-lingual text classification
Natural language inference
Multilingual text processing

Use Cases

Natural language processing
Cross-lingual text classification
Classify text in 15 different languages.
Accuracy on the XNLI benchmark ranges from 63.4% to 83.2% across languages.
Natural language inference
Determine the logical relationship (entailment, contradiction, or neutral) between two sentences.
Achieves 83.2% accuracy on English NLI tasks.
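
The NLI use case above can be sketched with the Hugging Face transformers library. The Hub id and the XNLI-style label order below are assumptions to verify against the model page (`model.config.id2label` gives the authoritative mapping):

```python
# Assumed Hub id for this checkpoint; verify against the model page.
MODEL_ID = "xlm-mlm-xnli15-1024"

# XNLI-style label order is an assumption; check model.config.id2label.
LABELS = ["contradiction", "entailment", "neutral"]

def label_from_logits(logits, labels=LABELS):
    """Map a flat list of class logits to the highest-scoring label name."""
    scores = list(logits)
    return labels[scores.index(max(scores))]

def classify_nli(premise: str, hypothesis: str) -> str:
    """Score a premise/hypothesis pair with the fine-tuned XLM checkpoint."""
    # Imports kept local so the pure helper above stays dependency-free.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
    inputs = tokenizer(premise, hypothesis, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0].tolist()
    return label_from_logits(logits)
```

For example, `classify_nli("A man is sleeping.", "A person is awake.")` should lean toward "contradiction" if the label mapping above matches the checkpoint's configuration.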