
Bert Base Greek Uncased V1

Developed by nlpaueb
GreekBERT is a pre-trained language model for Greek, suitable for various Greek natural language processing tasks.
Downloads: 5,984
Released: 3/2/2022

Model Overview

GreekBERT is a BERT-base model pre-trained on Greek text, capable of handling NLP tasks such as masked language modeling, text classification, and named entity recognition.
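As a sketch of the masked language modeling capability, the model can be queried through the Hugging Face `fill-mask` pipeline (assuming `transformers` and a backend such as `torch` are installed; the example sentence and top-k handling are illustrative, and the model weights are downloaded on first run):

```python
# Hedged sketch: masked-token prediction with GreekBERT via the
# Hugging Face transformers fill-mask pipeline.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="nlpaueb/bert-base-greek-uncased-v1")

# "Today is a beautiful [MASK]." in Greek. The input is lowercase and
# accent-free to match the model's uncased, deaccented pretraining text.
predictions = fill_mask("σημερα ειναι μια ομορφη [MASK].")
for p in predictions[:3]:
    print(p["token_str"], round(p["score"], 3))
```

Each prediction is a dict with the candidate token (`token_str`) and its probability (`score`).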

Model Features

Greek-specific Pretraining
Pre-trained exclusively on Greek text, enabling better understanding and processing of Greek.
Deaccentuation Support
Expects lowercased, accent-free Greek input (the model is uncased), improving robustness to accent variation.
Multi-domain Pretraining Data
Pre-trained on the Greek portions of Wikipedia, the European Parliament proceedings (Europarl), and the OSCAR corpus.
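Since the model expects lowercased, accent-free input, incoming Greek text needs matching preprocessing. A minimal sketch using only the standard library (the exact pipeline used by the authors may differ; this is one common way to deaccent via Unicode decomposition):

```python
import unicodedata

def strip_accents_and_lowercase(text: str) -> str:
    """Lowercase Greek text and remove accent marks.

    Decomposes characters (NFD) so accents become separate combining
    marks (category "Mn"), drops those marks, then lowercases.
    """
    decomposed = unicodedata.normalize("NFD", text)
    no_accents = "".join(
        ch for ch in decomposed if unicodedata.category(ch) != "Mn"
    )
    return no_accents.lower()

print(strip_accents_and_lowercase("Καλημέρα, Ελλάδα!"))  # -> καλημερα, ελλαδα!
```

Applying this step before tokenization keeps the input consistent with the deaccented, uncased pretraining corpus.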

Model Capabilities

Masked Language Modeling
Text Classification
Named Entity Recognition
Natural Language Inference

Use Cases

Text Understanding
Greek Named Entity Recognition
Identifies entities such as person and location names in Greek text
Achieves an F1 score of 85.7 on a Greek NER dataset
Semantic Analysis
Natural Language Inference
Determines logical relationships (entailment, contradiction, neutral) between Greek sentences
Achieves 78.6% accuracy on the Greek portion of the XNLI dataset