
GreekBART

Developed by dascim
GreekBART is a Greek sequence-to-sequence pre-trained model based on BART, particularly suitable for generation tasks such as summarization.
Downloads: 34
Release date: 10/14/2024

Model Overview

GreekBART is the first pre-trained Greek sequence-to-sequence model. It was pre-trained on a 76.9 GB corpus of raw Greek text with a denoising objective: input sentences are corrupted, and the model learns to reconstruct the original text.

Model Features

Greek-specific
Specifically pre-trained and optimized for Greek, filling the gap in Greek sequence-to-sequence models.
Multi-task support
Provides a base model and three fine-tuned versions supporting summarization, headline generation, and sentiment classification tasks.
Large-scale pre-training
Pre-trained on 76.9 GB of raw Greek text, giving the model broad coverage of the language.

Model Capabilities

Text summarization
News headline generation
Sentiment classification
Mask prediction
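The fine-tuned checkpoints can be used through the Hugging Face transformers pipeline API. The sketch below shows how summarization might be wired up; the model ID `dascim/greekbart-news24-abstract` is an assumption based on the naming above and should be verified against the checkpoints actually published on the Hub.

```python
def truncate_words(text: str, max_words: int = 512) -> str:
    """Crude pre-truncation so very long articles fit the encoder window."""
    return " ".join(text.split()[:max_words])


def summarize(text: str, model_id: str = "dascim/greekbart-news24-abstract") -> str:
    """Summarize a Greek article with a fine-tuned GreekBART checkpoint.

    The model ID is an assumption; check the Hugging Face Hub for the
    checkpoint names actually published by dascim.
    """
    # Imported lazily so truncate_words stays usable without transformers.
    from transformers import pipeline

    summarizer = pipeline("summarization", model=model_id)
    result = summarizer(truncate_words(text), max_length=64, truncation=True)
    return result[0]["summary_text"]
```

Headline generation works the same way, just with the title-generation checkpoint passed as `model_id`.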

Use Cases

News media
News summarization
Automatically generates concise summaries from Greek news articles
Example results show accurate extraction of key information
News headline generation
Automatically generates concise, engaging headlines for news content
Example headline 'Patras: Nurse testifies about Georgina's hospitalization'
Sentiment analysis
Comment sentiment classification
Classifies Greek text as positive/negative sentiment
Example accurately identifies 'Greek civilization is one of the richest and most widely recognized civilizations.' as positive
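For the classification checkpoint, the model's raw logits must be mapped to a sentiment label. A minimal sketch of that post-processing step, assuming a two-class head with the label order (negative, positive); the actual order should be verified against the checkpoint's `id2label` config:

```python
import math

# Assumed label order; verify against the checkpoint's id2label mapping.
LABELS = ("negative", "positive")


def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]


def label_from_logits(logits):
    """Return the most probable sentiment label and its probability."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[best], probs[best]
```

Subtracting the maximum logit before exponentiating avoids overflow for large logit values without changing the resulting probabilities.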