
XLM (xlm-mlm-en-2048)

Developed by Facebook AI
XLM is a masked language model pretrained on English text with a BERT-style masked language modeling (MLM) objective, supporting English language processing tasks.
Downloads 1,734
Release date: 3/2/2022

Model Overview

This model is the English-specific member of the cross-lingual language model (XLM) series developed by Facebook AI Research, focused on masked language modeling tasks.

Model Features

Cross-lingual Pretraining Architecture
Adopts a unified cross-lingual model architecture that can be extended to support multilingual processing
BERT-style Training Objective
Pretrained with a Masked Language Modeling (MLM) objective, making it well suited to contextual understanding tasks
Large-scale English Training
Focuses on deep pretraining for the English language, providing high-quality English language representations
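To illustrate the MLM pretraining objective mentioned above, here is a minimal sketch of the BERT-style masking rule (select roughly 15% of tokens; of those, 80% become a mask token, 10% become a random token, 10% stay unchanged). The `[MASK]` symbol, the `mask_tokens` helper, and the word-level tokenization are illustrative assumptions; XLM's actual tokenizer and special tokens differ.

```python
import random

def mask_tokens(tokens, mask_rate=0.15, vocab=None, rng=None):
    """Sketch of BERT-style MLM masking (illustrative, not XLM's exact code).

    Returns (inputs, labels): at selected positions, labels holds the
    original token the model must predict; elsewhere labels is None
    (no loss is computed there).
    """
    rng = rng or random.Random()
    vocab = vocab or tokens
    inputs, labels = [], []
    for tok in tokens:
        if rng.random() < mask_rate:
            labels.append(tok)          # prediction target
            r = rng.random()
            if r < 0.8:
                inputs.append("[MASK]")         # 80%: replace with mask token
            elif r < 0.9:
                inputs.append(rng.choice(vocab))  # 10%: random replacement
            else:
                inputs.append(tok)              # 10%: keep original
        else:
            inputs.append(tok)
            labels.append(None)
    return inputs, labels

inputs, labels = mask_tokens("the cat sat on the mat".split())
print(inputs)
print(labels)
```

During pretraining, the model only receives `inputs` and is trained to recover the tokens recorded in `labels`, which is what forces it to build the contextual representations the card describes.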

Model Capabilities

English text understanding
Masked word prediction
Contextual representation learning

Use Cases

Natural Language Processing
Text Filling
Predict masked words
Can accurately predict missing words in context
Text Feature Extraction
Obtain deep semantic representations of text
Can be used as input features for downstream NLP tasks