
ModernBERT PL

Developed by AleksanderObuchowski
A model based on the ModernBERT architecture, trained with cross-tokenization on a 24 GB parallel sentence corpus from OpenSubtitles with Polish language alignment.
Downloads: 61
Release date: 3/31/2025

Model Overview

This model is a Polish pre-trained model based on the ModernBERT architecture, primarily used for masked language modeling tasks.

Model Features

Cross-tokenization
Trained on parallel sentence corpora using cross-tokenization, potentially enhancing the model's understanding of Polish.
Polish language alignment
Specifically trained and optimized for Polish, suitable for Polish-related natural language processing tasks.

Model Capabilities

Masked language modeling
Polish text understanding

Use Cases

Natural language processing
Polish text infilling
Used to fill in missing parts of Polish text; see the usage sketch below.
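
The following is a minimal sketch of Polish text infilling with the Hugging Face transformers fill-mask pipeline. The repository ID and example sentence are assumptions, not taken from this model card; substitute the model's actual Hub ID, and note that the sketch assumes the tokenizer uses the standard [MASK] token.

```python
# Minimal fill-mask sketch for Polish text infilling with this model.
# The repository ID below is an assumption; replace it with the model's
# actual ID on the Hugging Face Hub.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="AleksanderObuchowski/modernbert-pl",  # assumed repository ID
)

# Example sentence: "Warszawa jest [MASK] Polski."
# ("Warsaw is the [MASK] of Poland.")
for prediction in fill_mask("Warszawa jest [MASK] Polski."):
    print(f"{prediction['token_str']:>15}  score={prediction['score']:.3f}")
```

The pipeline returns the top-scoring candidate tokens for the masked position along with their probabilities, which is the typical way to exercise a masked language model on an infilling task.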