Bert Base Polish Uncased V1

Developed by dkleczek
A Polish version of the BERT language model, available in both cased and uncased variants, suitable for Polish natural language processing tasks.
Downloads 3,853
Release time: 3/2/2022

Model Overview

Polbert is a Polish pre-trained language model based on the BERT architecture, supporting various downstream NLP tasks such as text classification and named entity recognition.

Model Features

Polish language optimization
Specially optimized for Polish language characteristics, correctly handling special characters and diacritics in Polish.
Whole word masking technique
The cased version is trained with whole-word masking, which improves the model's language comprehension.
Corpus optimization
Trained on a deduplicated and more balanced Polish corpus.

Model Capabilities

Text classification
Named entity recognition
Fill-mask prediction
Semantic understanding
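
The fill-mask capability listed above can be exercised directly with the Hugging Face `transformers` pipeline. A minimal sketch, assuming the model is published on the Hub under the ID `dkleczek/bert-base-polish-uncased-v1` (the developer and model names from this card) and that the example sentence about Adam Mickiewicz is chosen for illustration:

```python
from transformers import pipeline

# Load a fill-mask pipeline for the uncased Polish BERT.
# Model ID assumed from the card's developer/model names.
fill_mask = pipeline("fill-mask", model="dkleczek/bert-base-polish-uncased-v1")

# Ask the model to fill the masked token in a Polish sentence
# ("Adam Mickiewicz was a great Polish [MASK].").
preds = fill_mask("Adam Mickiewicz wielkim polskim [MASK] był.")

# Each prediction carries the candidate token and its probability.
for p in preds:
    print(p["token_str"], round(p["score"], 3))
```

The pipeline returns the top candidate tokens ranked by probability; for a well-trained Polish model, words like "poetą" (poet) or "pisarzem" (writer) should rank highly for this sentence.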

Use Cases

Text understanding
Poetry author identification
Identifying fragments of works by famous Polish poets
Correctly predicted 'pisarzem' (writer) for a masked token in a sentence about Adam Mickiewicz
Academic research
Polish linguistic analysis
Used for studying Polish grammar and semantic features