Nomic XLM 2048
A fine-tuned model based on XLM-RoBERTa base that replaces the original absolute position embeddings with RoPE (Rotary Position Embedding), supporting sequence lengths up to 2048 tokens
Downloads: 440
Release Date: 11/16/2024
Model Overview
A multilingual encoder model suited to masked language modeling and sequence classification tasks, with strong performance on the GLUE and XTREME-R benchmarks
Model Features
Rotary Position Embedding (RoPE)
Replaces the original absolute position embeddings with RoPE, enabling longer sequence lengths (see the sketch after this feature list)
Long sequence support
Supports processing sequences up to 2048 tokens, surpassing the original XLM-R's 512-token limit
Multilingual capability
Performs well on various multilingual tasks, especially on the XTREME-R benchmark
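The core idea of RoPE is to rotate query and key vectors by position-dependent angles so that attention scores depend only on relative positions. A minimal sketch in PyTorch, assuming the common GPT-NeoX-style formulation (not necessarily this model's exact implementation):

```python
import torch

def rotate_half(x):
    # Split the last dimension in two and rotate: (x1, x2) -> (-x2, x1)
    x1, x2 = x.chunk(2, dim=-1)
    return torch.cat((-x2, x1), dim=-1)

def apply_rope(q, k, base=10000.0):
    # q, k: (batch, heads, seq_len, head_dim); head_dim must be even.
    seq_len, dim = q.shape[-2], q.shape[-1]
    inv_freq = 1.0 / (base ** (torch.arange(0, dim, 2).float() / dim))
    pos = torch.arange(seq_len).float()
    freqs = torch.outer(pos, inv_freq)       # (seq_len, dim/2)
    emb = torch.cat((freqs, freqs), dim=-1)  # (seq_len, dim)
    cos, sin = emb.cos(), emb.sin()
    # Rotate each query/key vector by its position-dependent angle.
    return q * cos + rotate_half(q) * sin, k * cos + rotate_half(k) * sin
```

Because the rotation encodes relative offsets rather than learned absolute positions, the same weights can be applied at sequence lengths beyond the original 512-position embedding table.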
Model Capabilities
Masked language modeling (usage sketch after this list)
Sequence classification
Multilingual text understanding
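A hedged usage sketch for the masked language modeling capability via the Hugging Face fill-mask pipeline. The repo id "nomic-ai/nomic-xlm-2048" and the need for trust_remote_code are assumptions; XLM-R tokenizers use the <mask> token:

```python
from transformers import pipeline

# Model id and trust_remote_code are assumptions; adjust to the actual repo.
fill_mask = pipeline(
    "fill-mask",
    model="nomic-ai/nomic-xlm-2048",
    trust_remote_code=True,  # custom RoPE layers may live in remote code
)
for candidate in fill_mask("Paris is the <mask> of France."):
    print(candidate["token_str"], candidate["score"])
```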
Use Cases
Natural Language Processing
Multilingual text classification
Classifying text written in multiple languages
Performs well on cross-lingual understanding tasks such as XNLI
Long document processing
Handling tasks over long text sequences (see the encoding sketch after this list)
Supports processing sequences up to 2048 tokens in length
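A hedged sketch combining the two use cases above: encoding a long document up to the 2048-token limit and feeding it to a sequence classification head. The repo id is assumed as before, and the classification head below is freshly initialized, so it would need fine-tuning (for example on XNLI) before its predictions are meaningful:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "nomic-ai/nomic-xlm-2048"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForSequenceClassification.from_pretrained(
    model_id, num_labels=3, trust_remote_code=True  # 3 labels, as in XNLI
)

long_text = " ".join(["This sentence pads out a long document."] * 300)
inputs = tokenizer(
    long_text,
    truncation=True,
    max_length=2048,  # 4x the original XLM-R limit of 512
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1))
```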