
XLM-MLM-17-1280

Developed by FacebookAI
XLM is a cross-lingual language model pretrained on text in 17 languages with the masked language modeling (MLM) objective
Downloads 201
Release Time: 3/2/2022

Model Overview

XLM-MLM-17-1280 is a Transformer pretrained with the masked language modeling (MLM) objective on text in 17 languages, supporting cross-lingual understanding tasks
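Because the checkpoint was trained with the MLM objective, it can fill in blanked-out tokens directly. A minimal sketch using the Transformers fill-mask pipeline, assuming the checkpoint is published on the Hugging Face Hub under the id FacebookAI/xlm-mlm-17-1280; note that XLM tokenizers use `<special1>` as the mask token rather than `[MASK]`, so it is read from the tokenizer instead of being hard-coded:

```python
from transformers import pipeline

# Assumes the Hub id "FacebookAI/xlm-mlm-17-1280"; the fill-mask
# pipeline dispatches to the model's masked-LM head.
unmasker = pipeline("fill-mask", model="FacebookAI/xlm-mlm-17-1280")

# XLM uses "<special1>" as its mask token, not "[MASK]".
mask = unmasker.tokenizer.mask_token
for pred in unmasker(f"Paris is the {mask} of France."):
    print(pred["token_str"], pred["score"])
```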

Model Features

Multilingual support
Supports cross-lingual understanding tasks in 17 languages
Large-scale pretraining
Pretrained on a large-scale multilingual corpus
Transformer architecture
Uses a 16-layer Transformer architecture with a hidden size of 1280 (see the configuration sketch below)
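A quick way to verify these dimensions is to load the checkpoint's configuration. The attribute names below (n_layers, emb_dim, n_heads) are those of the Transformers XLMConfig class; the Hub id is an assumption as above:

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("FacebookAI/xlm-mlm-17-1280")
print(config.n_layers)  # expected: 16 Transformer layers
print(config.emb_dim)   # expected: hidden size of 1280
print(config.n_heads)   # attention heads per layer
```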

Model Capabilities

Cross-lingual text understanding
Masked language modeling
Multilingual text representation (see the embedding sketch below)
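To use the encoder as a multilingual representation backbone, one option is to mean-pool its final hidden states into a sentence vector. A minimal sketch under the same Hub-id assumption; per the Transformers documentation, this particular checkpoint does not require explicit language-id embeddings at inference:

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("FacebookAI/xlm-mlm-17-1280")
model = AutoModel.from_pretrained("FacebookAI/xlm-mlm-17-1280")
model.eval()

# The same encoder maps sentences from any of the 17 training
# languages into one shared 1280-dimensional space.
for text in ["The weather is nice today.", "Il fait beau aujourd'hui."]:
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, 1280)
    sentence_vec = hidden.mean(dim=1)  # mean-pool tokens into one vector
    print(sentence_vec.shape)  # torch.Size([1, 1280])
```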

Use Cases

Natural Language Processing
Cross-lingual text classification
Applicable to cross-lingual classification benchmarks such as XNLI (a minimal fine-tuning sketch follows below)
Reported XNLI accuracies: English 84.8, Spanish 79.4, German 76.2, Arabic 71.5, and Chinese 75.0
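A minimal sketch of adapting the encoder to an XNLI-style task using the Transformers XLMForSequenceClassification head, under the same Hub-id assumption. The classification head here is freshly initialized, so it would need fine-tuning on labeled NLI data before its predictions are meaningful:

```python
from transformers import AutoTokenizer, XLMForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("FacebookAI/xlm-mlm-17-1280")
# num_labels=3 matches XNLI's entailment / neutral / contradiction
# scheme; this head is randomly initialized until fine-tuned.
model = XLMForSequenceClassification.from_pretrained(
    "FacebookAI/xlm-mlm-17-1280", num_labels=3
)

# Premise and hypothesis are encoded together as a single pair.
inputs = tokenizer(
    "A man is eating food.", "Someone is eating.", return_tensors="pt"
)
logits = model(**inputs).logits
print(logits.shape)  # torch.Size([1, 3])
```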
Multilingual text representation
Generate multilingual text representations for downstream NLP tasks