
ModernBERT Base Ita

Developed by DeepMount00
ModernBERT is a modern bidirectional encoder-only Transformer model (BERT-style), pre-trained on 2 trillion tokens of English and code data, with a native context length of up to 8,192 tokens.
Downloads: 81
Release date: 12/19/2024

Model Overview

ModernBERT is a modern bidirectional encoder-only Transformer model, suitable for tasks involving long documents, such as retrieval, classification, and semantic search in large-scale corpora.
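As a BERT-style encoder, the model is most directly used for masked language modeling. A minimal sketch with the Hugging Face `transformers` `fill-mask` pipeline follows; the repository id is an assumption based on the model name, so verify it on the Hub before use (downloading the weights requires network access).

```python
# Hedged sketch: masked language modeling with a ModernBERT-style encoder.
# MODEL_ID is an assumed repository id, not confirmed by this page.
from transformers import pipeline

MODEL_ID = "DeepMount00/Modernbert-Base-Ita"  # assumption: verify on the Hub

def fill_mask(text: str, model_id: str = MODEL_ID):
    """Return the top predictions for the [MASK] token in `text`."""
    # The pipeline downloads the model on first call (network required).
    fill = pipeline("fill-mask", model=model_id)
    return fill(text)

if __name__ == "__main__":
    # Example Italian prompt; each prediction carries a token and a score.
    for pred in fill_mask("Roma è la [MASK] d'Italia."):
        print(pred["token_str"], round(pred["score"], 3))
```

The same encoder outputs can also be pooled into sentence embeddings for the retrieval and classification tasks listed below.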

Model Features

Rotary Position Embeddings (RoPE): support processing of long contexts.
Local-global alternating attention: improves efficiency on long inputs.
Unpadding and Flash Attention: enable efficient inference.
Native long-context support: context length of up to 8,192 tokens.
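Two of the features above can be illustrated in a few lines of NumPy: RoPE rotates pairs of channels by a position-dependent angle (so relative position is encoded in the dot product), and local attention restricts each token to a sliding window, which alternating local/global layers exploit for efficiency. This is a minimal sketch; the dimensions, rotary base, and window size are illustrative, not ModernBERT's actual hyperparameters.

```python
import numpy as np

def rope(x: np.ndarray, base: float = 10000.0) -> np.ndarray:
    """Apply rotary position embeddings to x of shape (seq_len, dim).

    Each channel pair (x1[i], x2[i]) is rotated by an angle that grows
    with position, so attention scores depend on relative positions.
    Illustrative sketch, not ModernBERT's exact parameterization.
    """
    seq_len, dim = x.shape
    half = dim // 2
    freqs = 1.0 / base ** (np.arange(half) / half)   # per-pair frequency
    angles = np.outer(np.arange(seq_len), freqs)     # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    return np.concatenate([x1 * cos - x2 * sin, x1 * sin + x2 * cos], axis=-1)

def local_attention_mask(seq_len: int, window: int) -> np.ndarray:
    """Boolean mask: token i may attend to token j iff |i - j| <= window.

    Alternating layers with this local mask and full (global) attention
    keeps cost manageable on long inputs.
    """
    idx = np.arange(seq_len)
    return np.abs(idx[:, None] - idx[None, :]) <= window
```

Because RoPE is a pure rotation, it preserves the norm of each token vector while making attention scores position-aware.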

Model Capabilities

Masked language modeling
Long-context processing
Semantic search
Code retrieval
Text classification

Use Cases

Natural Language Processing
Text classification: classification tasks over long documents.
Semantic search: semantic search in large-scale corpora.
Code processing
Code retrieval: retrieval tasks over codebases.
Achieved state-of-the-art results for code retrieval on CodeSearchNet and StackQA.
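The retrieval use cases above follow one common pattern: embed the query and each document with the encoder, then rank by cosine similarity. The sketch below assumes mean pooling over non-padding tokens, a conventional choice rather than this model's documented recipe, and the repository id is again an assumption.

```python
# Hedged sketch of semantic search with encoder embeddings:
# mean-pool hidden states, then rank documents by cosine similarity.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two 1-D vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank(query_vec: np.ndarray, doc_vecs) -> list:
    """Document indices sorted by descending similarity to the query."""
    scores = [cosine_similarity(query_vec, d) for d in doc_vecs]
    return sorted(range(len(scores)), key=lambda i: -scores[i])

if __name__ == "__main__":
    # Embedding step (requires network; repo id and pooling are assumptions).
    import torch
    from transformers import AutoModel, AutoTokenizer

    model_id = "DeepMount00/Modernbert-Base-Ita"  # assumed repo id
    tok = AutoTokenizer.from_pretrained(model_id)
    model = AutoModel.from_pretrained(model_id)

    texts = ["una query", "un documento", "un altro documento"]
    batch = tok(texts, padding=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state        # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)          # (B, T, 1)
    emb = ((hidden * mask).sum(1) / mask.sum(1)).numpy()  # mean pooling
    print(rank(emb[0], emb[1:]))  # documents ranked against the query
```

For large corpora the document embeddings would normally be precomputed and indexed; the ranking step itself is model-agnostic.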