
xlm-mlm-tlm-xnli15-1024

Developed by FacebookAI
XLM is a cross-lingual Transformer model pre-trained with masked language modeling and translation language modeling objectives, supporting text classification tasks in 15 languages.
Downloads: 198
Release Time: 3/2/2022

Model Overview

This model is based on the Transformer architecture, pre-trained on multilingual corpora and fine-tuned on English NLI data, and can handle cross-lingual text classification tasks in 15 languages.

Model Features

Cross-lingual capability
Through multilingual pre-training and translation language modeling objectives, the model can handle text classification tasks in 15 languages.
Efficient pre-training
Pre-trained using masked language modeling (MLM) and translation language modeling (TLM) objectives to optimize cross-lingual representations.
Multilingual evaluation
Comprehensively evaluated on the XNLI dataset in 15 languages, demonstrating strong cross-lingual transfer capabilities.
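The TLM objective mentioned above can be illustrated with a minimal sketch: a parallel sentence pair is concatenated and tokens from both languages are randomly masked, so the model can attend across the pair to recover a masked word from its translation. `make_tlm_example`, the `[MASK]`/`</s>` strings, and the word-level tokens are illustrative assumptions, not the library's API; the 15% masking rate follows the XLM paper.

```python
import random

MASK = "[MASK]"
SEP = "</s>"   # separator between the two parallel sentences

def make_tlm_example(src_tokens, tgt_tokens, mask_prob=0.15, seed=0):
    """Assemble one TLM training example: concatenate a parallel
    sentence pair and randomly mask tokens in BOTH languages, so the
    model can use the other language's context to predict them.
    (Hypothetical helper for illustration only.)"""
    rng = random.Random(seed)
    tokens = src_tokens + [SEP] + tgt_tokens
    inputs, targets = [], []
    for tok in tokens:
        if tok != SEP and rng.random() < mask_prob:
            inputs.append(MASK)
            targets.append(tok)   # loss is computed at this position
        else:
            inputs.append(tok)
            targets.append(None)  # no prediction target here
    return inputs, targets

inputs, targets = make_tlm_example(
    ["the", "cat", "sat"], ["le", "chat", "assis"], seed=0)
```

In the actual model, each position additionally carries a language embedding, and position indices are reset for the second sentence; those details are omitted from this sketch.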

Model Capabilities

Cross-lingual text classification
Natural language inference
Multilingual text understanding

Use Cases

Natural language processing
Cross-lingual text classification
Perform text classification tasks in 15 languages
Achieved 67.3-85.0% accuracy on the XNLI dataset
Multilingual content analysis
Analyze text content in different languages and extract key information
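For the cross-lingual classification use case above, the model's classifier head emits three logits per premise/hypothesis pair, which are mapped to an XNLI label via softmax and argmax. A minimal sketch follows; the label order is an assumption for illustration, and a real deployment should read it from the model's configuration rather than hard-coding it.

```python
import math

# Assumed label order (illustrative); check the model config's
# id2label mapping before relying on it.
XNLI_LABELS = ["contradiction", "entailment", "neutral"]

def classify(logits):
    """Turn the classifier head's three logits into an XNLI label
    using a numerically stable softmax followed by argmax."""
    m = max(logits)                      # subtract max for stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return XNLI_LABELS[best], probs[best]

label, prob = classify([0.2, 3.1, -1.0])  # hypothetical logits
```

Because the classifier was fine-tuned only on English NLI data, the same three-way decision transfers zero-shot to the other 14 XNLI languages.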