XLM (xlm-mlm-100-1280)

Developed by FacebookAI
The XLM model is a cross-lingual language model pre-trained on Wikipedia text in 100 languages using the masked language modeling (MLM) objective.
Downloads: 296
Release Date: 3/2/2022

Model Overview

XLM is a multilingual language model based on the Transformer architecture. It supports 100 languages and is primarily used for cross-lingual understanding and generation tasks.
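
A minimal usage sketch, assuming the Hugging Face Hub id "FacebookAI/xlm-mlm-100-1280" and the transformers and torch libraries (not part of the original listing), showing the pre-training task the card describes: predicting a masked token.

```python
# Minimal masked-language-modeling sketch; assumes transformers + torch installed.
import torch
from transformers import XLMTokenizer, XLMWithLMHeadModel

model_id = "FacebookAI/xlm-mlm-100-1280"  # assumed Hub id
tokenizer = XLMTokenizer.from_pretrained(model_id)
model = XLMWithLMHeadModel.from_pretrained(model_id)
model.eval()

# XLM's mask token is "<special1>", exposed as tokenizer.mask_token.
text = f"Paris is the capital of {tokenizer.mask_token} ."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (batch, seq_len, vocab_size)

# Rank vocabulary entries at the masked position and show the top 5.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top_ids = logits[0, mask_pos[0]].topk(5).indices
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```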

Model Features

Multilingual Support
Supports cross-lingual understanding and generation tasks in 100 languages
Large-scale Pre-training
Pre-trained on Wikipedia texts in 100 languages
Transformer Architecture
Utilizes a 16-layer Transformer architecture with a hidden size of 1280 and 16 attention heads
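
The layer, hidden-size, and head counts above can be verified from the published configuration; a short sketch, assuming the same Hub id as before:

```python
# Read the architecture hyperparameters from the checkpoint's config.
from transformers import XLMConfig

config = XLMConfig.from_pretrained("FacebookAI/xlm-mlm-100-1280")  # assumed Hub id
print(config.n_layers, config.emb_dim, config.n_heads)  # expected: 16 1280 16
```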

Model Capabilities

Cross-lingual Text Understanding
Masked Language Modeling
Multilingual Text Generation
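
For the cross-lingual understanding capability, one common pattern is to use the encoder's hidden states as sentence representations shared across languages. A minimal sketch; the mean pooling here is an illustrative choice, not something prescribed by the model card.

```python
# Extract sentence vectors for sentences in different languages.
import torch
from transformers import XLMTokenizer, XLMModel

model_id = "FacebookAI/xlm-mlm-100-1280"  # assumed Hub id
tokenizer = XLMTokenizer.from_pretrained(model_id)
model = XLMModel.from_pretrained(model_id)
model.eval()

sentences = [
    "The cat sits on the mat.",         # English
    "Le chat est assis sur le tapis.",  # French
]
for text in sentences:
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, 1280)
    sentence_vec = hidden.mean(dim=1)  # (1, 1280) mean-pooled sentence vector
    print(text, sentence_vec.shape)
```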

Use Cases

Cross-lingual Classification
XNLI Cross-lingual Classification
Performs cross-lingual natural language inference on the XNLI dataset (a fine-tuning sketch follows this list)
Reported accuracy: 83.7% for English and 71.7% for Chinese, among other languages
Language Understanding
Multilingual Text Understanding
Understands text content in 100 languages
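
For the XNLI use case above, the pre-trained encoder is paired with a 3-way classification head (entailment / neutral / contradiction) and fine-tuned. A minimal setup sketch, assuming the same Hub id; the head here is randomly initialized, so its outputs are meaningless until fine-tuned on XNLI.

```python
# XNLI-style setup: encoder + fresh 3-way NLI classification head.
import torch
from transformers import XLMTokenizer, XLMForSequenceClassification

model_id = "FacebookAI/xlm-mlm-100-1280"  # assumed Hub id
tokenizer = XLMTokenizer.from_pretrained(model_id)
model = XLMForSequenceClassification.from_pretrained(model_id, num_labels=3)
model.eval()

premise = "A man is playing a guitar."
hypothesis = "A person is making music."
inputs = tokenizer(premise, hypothesis, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, 3); untrained head, for illustration only
print(logits.softmax(dim=-1))
```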