
Palobert Base Greek Uncased V1

Developed by gealexandri
A Greek language model based on RoBERTa, designed specifically for Greek social media text
Released: 3/2/2022

Model Overview

PaloBERT is a pre-trained Greek language model based on the RoBERTa architecture, used primarily for processing and analyzing Greek social media text. The model was trained on 458,293 Greek social media documents and includes a GPT-2 tokenizer trained from scratch on the same corpus.

Model Features

Greek language optimization: specifically trained and optimized for Greek social media text
Custom tokenizer: includes a GPT-2 tokenizer trained from scratch on the same dataset
Large-scale training data: trained on 458,293 Greek social media documents
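Since the model and its custom tokenizer are distributed as a standard pre-trained checkpoint, they can be loaded with the Hugging Face Transformers library. A minimal sketch follows; the repository id is an assumption inferred from the page title, so confirm it on the Hugging Face Hub before relying on it.

```python
# Minimal sketch: loading PaloBERT and its tokenizer with Transformers.
# The repo id below is an assumption based on the model card title.
MODEL_ID = "gealexandri/palobert-base-greek-uncased-v1"  # assumed repo id

def load_palobert(model_id: str = MODEL_ID):
    """Return (tokenizer, masked-LM model); downloads weights on first call."""
    # Imported inside the function so the sketch itself runs even where
    # the transformers package is not installed.
    from transformers import AutoTokenizer, AutoModelForMaskedLM
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForMaskedLM.from_pretrained(model_id)
    return tokenizer, model

# Usage (requires network access and the transformers package):
# tokenizer, model = load_palobert()
```

The masked-LM head matches RoBERTa-style pre-training; for downstream tasks such as sentiment analysis, a task-specific head would be attached instead.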

Model Capabilities

Greek text understanding
Social media text analysis
Sentiment analysis
Opinion mining

Use Cases

Sentiment analysis: analyze the sentiment tendencies of Greek social media users
Opinion mining: extract user opinions from Greek social media text
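For the sentiment analysis use case above, a common approach is to fine-tune the pre-trained checkpoint with a sequence-classification head. The sketch below illustrates the setup; the repository id and the three-class label scheme are illustrative assumptions, not part of the original model card.

```python
# Hedged sketch: preparing PaloBERT for Greek sentiment classification.
LABELS = ["negative", "neutral", "positive"]  # assumed label scheme
MODEL_ID = "gealexandri/palobert-base-greek-uncased-v1"  # assumed repo id

def build_sentiment_classifier(model_id: str = MODEL_ID):
    """Return (tokenizer, classifier) ready for fine-tuning on labeled data."""
    # Lazy import keeps the sketch runnable without transformers installed.
    from transformers import AutoTokenizer, AutoModelForSequenceClassification
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # A randomly initialized classification head is added on top of the
    # pre-trained encoder; it must be fine-tuned before use.
    model = AutoModelForSequenceClassification.from_pretrained(
        model_id, num_labels=len(LABELS)
    )
    return tokenizer, model
```

After fine-tuning on labeled Greek social media posts, the classifier maps each input text to one of the assumed sentiment labels.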