
GPT-2 Finetuned Greek

Developed by lighteternal
A Greek text generation model fine-tuned from the English GPT-2 model, jointly developed by the Hellenic Military Academy and the Technical University of Crete.
Downloads: 178
Release Time: 3/2/2022

Model Overview

This is a Greek text generation model based on the OpenAI GPT-2 architecture and fine-tuned through gradual layer unfreezing. It is suited to Greek text generation tasks.

Model Features

Efficient Fine-tuning Method
Uses gradual layer unfreezing during fine-tuning, which is more efficient than training from scratch and especially well suited to low-resource languages.
Large-scale Training Data
Trained on approximately 23.4 GB of Greek text drawn from multiple sources.
Pre-trained Model Transfer
Fine-tuned from the English GPT-2 model, transferring knowledge from the pretrained weights.
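The gradual layer unfreezing mentioned above can be sketched in PyTorch: the model's layers start frozen, and successively deeper stages are made trainable as fine-tuning progresses. This is a minimal illustration with a hypothetical 4-layer stack, not the model's actual training code.

```python
import torch.nn as nn

# Hypothetical stand-in for GPT-2's stack of transformer blocks;
# this sketch uses 4 small linear layers instead of the real blocks.
blocks = nn.ModuleList([nn.Linear(8, 8) for _ in range(4)])

def unfreeze_top(blocks, n):
    """Freeze all blocks, then make only the top n trainable."""
    for block in blocks:
        for p in block.parameters():
            p.requires_grad = False
    for block in list(blocks)[-n:]:
        for p in block.parameters():
            p.requires_grad = True

# Gradual schedule (illustrative): unfreeze more of the stack at each
# phase, running some fine-tuning steps in between (loop omitted).
for n_unfrozen in (1, 2, 4):
    unfreeze_top(blocks, n_unfrozen)
```

Starting from the top layers lets the model adapt its task-specific representations first while preserving the lower-level features learned during English pretraining.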

Model Capabilities

Greek text generation
Language model continuation
Creative writing assistance
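A typical way to exercise these capabilities is through the Hugging Face Transformers text-generation pipeline. The model id below is an assumption based on the developer name and may need to be checked against the actual repository.

```python
from transformers import pipeline

# Assumed Hugging Face model id; verify against the actual repository.
generator = pipeline("text-generation",
                     model="lighteternal/gpt2-finetuned-greek")

# Continue a Greek prompt ("Greece is a country that...").
outputs = generator(
    "Η Ελλάδα είναι μια χώρα που",
    max_new_tokens=40,
    do_sample=True,            # sampling yields varied continuations
    num_return_sequences=2,    # multiple options for creative use
)
for out in outputs:
    print(out["generated_text"])
```

Sampling with multiple return sequences matches the creative-writing use case: the caller can pick among several candidate continuations.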

Use Cases

Text Generation
Story Continuation
Generates coherent story content based on a given beginning
Produces coherent text that conforms to Greek grammar and context
Content Creation Assistance
Assists writers or content creators in generating creative text
Provides diverse options for text continuation