
xlm-mlm-enro-1024

Developed by FacebookAI
An English-Romanian Transformer model pretrained with the masked language modeling objective; language embeddings specify the language at inference time
Release Time: 3/2/2022

Model Overview

This is a cross-lingual language model (XLM) pretrained specifically on English and Romanian, primarily intended for masked language modeling tasks.

Model Features

Cross-lingual Capability
Supports bilingual processing for English and Romanian
Language Embeddings
Uses language embeddings to specify the language during inference
Efficient Training
Utilizes float16 operations to accelerate training and reduce memory usage
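The language-embedding feature above can be sketched in isolation: at inference, XLM expects a `langs` tensor holding one language id per token position, parallel to `input_ids`. The `LANG2ID` mapping below is a stand-in for illustration; on the real checkpoint it is exposed via the tokenizer's `lang2id` attribute.

```python
import torch

# Stand-in language-id mapping (an assumption for illustration);
# the real ids come from the pretrained tokenizer's lang2id attribute.
LANG2ID = {"en": 0, "ro": 1}

def make_langs(input_ids: torch.Tensor, lang: str) -> torch.Tensor:
    """Build the `langs` tensor XLM expects: one language id per
    token position, with the same shape as `input_ids`."""
    return torch.full_like(input_ids, LANG2ID[lang])

ids = torch.tensor([[0, 5, 6, 7, 1]])  # dummy token ids
print(make_langs(ids, "ro"))           # every position tagged with the Romanian id
```

The resulting tensor is passed to the model alongside `input_ids` so the language embedding is added at every position.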

Model Capabilities

Masked Language Modeling
Cross-lingual Text Understanding
Bilingual Text Generation

Use Cases

Natural Language Processing
Text Infilling
Predicts and fills masked words in text
Cross-lingual Transfer Learning
Serves as a pretrained model for downstream cross-lingual tasks
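The text-infilling use case reduces to a simple prediction step that can be sketched without downloading the checkpoint: given the model's MLM logits, take the argmax at each masked position and keep every other token unchanged. The mask id, token ids, and logits below are dummy values for illustration, not drawn from the real vocabulary.

```python
import torch

def fill_mask(logits: torch.Tensor, input_ids: torch.Tensor, mask_id: int) -> torch.Tensor:
    """Greedy infilling: replace each masked position with the
    argmax prediction from the MLM output logits."""
    preds = logits.argmax(dim=-1)
    return torch.where(input_ids == mask_id, preds, input_ids)

# Dummy demonstration: vocab of 10, sequence of 4 tokens, position 2 masked.
MASK_ID = 9
ids = torch.tensor([[3, 4, MASK_ID, 5]])
logits = torch.zeros(1, 4, 10)
logits[0, 2, 7] = 5.0  # pretend the model is confident token 7 fits the blank
print(fill_mask(logits, ids, MASK_ID))  # -> tensor([[3, 4, 7, 5]])
```

In practice the logits come from a forward pass of the pretrained model (with the `langs` tensor set appropriately), and the predicted id is decoded back to text by the tokenizer.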