
RoBERTa EL News

Developed by cvcio
RoBERTa model pretrained on Greek news data, specializing in masked language modeling tasks
Downloads: 51
Release Time: 3/2/2022

Model Overview

This RoBERTa model was pretrained on Greek news data with a masked language modeling (MLM) objective and is suitable for Greek text-processing tasks.
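
As a quick illustration, the model can be loaded with the Hugging Face transformers library. This is a minimal sketch, assuming the model ID is cvcio/roberta-el-news (inferred from the developer and model name above):

```python
# Minimal sketch: loading the model for masked language modeling with
# Hugging Face transformers. The model ID "cvcio/roberta-el-news" is
# an assumption based on the developer (cvcio) and model name above.
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "cvcio/roberta-el-news"

tokenizer = AutoTokenizer.from_pretrained(model_id)      # BPE tokenizer
model = AutoModelForMaskedLM.from_pretrained(model_id)   # RoBERTa encoder with MLM head
```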

Model Features

Trained on Greek News Data
Pretrained on 8 million Greek news articles (approximately 160 million sentences) published between 2016 and 2021
Preserves Diacritics
Retains all diacritical marks when processing Greek text
Case Insensitive
The model does not distinguish between uppercase and lowercase text
Efficient Tokenization
Uses a byte-pair encoding (BPE) tokenizer with a vocabulary of 50,265 tokens
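
To see the tokenizer in action, and to check that diacritics survive tokenization, the following sketch runs a Greek sentence through it. The sentence is a made-up example and the model ID is assumed as above:

```python
# Sketch: tokenizing a Greek sentence to inspect how diacritics are handled.
# The sentence is a made-up example; the model ID is an assumption.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("cvcio/roberta-el-news")

text = "Η οικονομία της Ελλάδας αναπτύσσεται σταθερά."  # "Greece's economy is growing steadily."
tokens = tokenizer.tokenize(text)
ids = tokenizer.encode(text)

print(tokens)    # BPE subword pieces; accent marks (tonos) are kept
print(len(ids))  # input IDs, including the special tokens <s> and </s>
```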

Model Capabilities

Greek Text Understanding
Masked Language Prediction
Named Entity Recognition (with fine-tuning)
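
Named entity recognition is not available out of the box; the pretrained encoder must be fine-tuned with a token-classification head. The sketch below shows how such a head could be attached with transformers; the label set is a hypothetical example, not part of the original model:

```python
# Sketch: attaching a token-classification head for NER fine-tuning.
# The label set below is hypothetical; replace it with your own tag scheme.
from transformers import AutoModelForTokenClassification, AutoTokenizer

labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]
id2label = {i: label for i, label in enumerate(labels)}
label2id = {label: i for i, label in enumerate(labels)}

tokenizer = AutoTokenizer.from_pretrained("cvcio/roberta-el-news")
model = AutoModelForTokenClassification.from_pretrained(
    "cvcio/roberta-el-news",
    num_labels=len(labels),
    id2label=id2label,
    label2id=label2id,
)
# The classification head is randomly initialized; it needs to be trained on
# an annotated Greek NER dataset (e.g. with the transformers Trainer) before use.
```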

Use Cases

News Analysis
Political News Analysis
Analyzing key information in Greek political news
In example runs, the model correctly predicted key terms in political reports
Text Completion
News Text Completion
Predicting masked words in news texts
In example runs, the model correctly predicted masked words such as 'public' and 'release'
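
This masked-word use case maps directly onto the transformers fill-mask pipeline. The sketch below uses a made-up Greek sentence (roughly "The government announced new measures for <mask> health") and the assumed model ID from earlier:

```python
# Sketch: masked-word prediction with the fill-mask pipeline.
# The Greek sentence is a made-up example; the model ID is an assumption.
from transformers import pipeline

fill = pipeline("fill-mask", model="cvcio/roberta-el-news")

# RoBERTa models use "<mask>" as the mask token.
results = fill("Η κυβέρνηση ανακοίνωσε νέα μέτρα για τη <mask> υγεία.")

for r in results:
    print(f"{r['token_str']!r}  score={r['score']:.3f}")
```

Each result contains the candidate token, its score, and the completed sequence, so the top predictions can be inspected or filtered directly.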