
Text Generation News Gpt2 Small Hungarian

Developed by NYTK
A GPT-2 model pretrained on Hungarian Wikipedia and fine-tuned on a Hungarian news corpus, designed specifically for Hungarian news text generation.
Downloads 270
Release Time: 3/2/2022

Model Overview

This model is a Hungarian text generation model based on the GPT-2 architecture. It was pretrained on Hungarian Wikipedia and fine-tuned on text from Hungarian news websites (hvg.hu, index.hu, nol.hu), making it specifically suited to generating Hungarian news content.
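
The model can be used with the Hugging Face transformers library. The snippet below is a minimal sketch, assuming the model is published on the Hugging Face Hub under an identifier such as NYTK/text-generation-news-gpt2-small-hungarian (the exact repository name, prompt, and generation settings are illustrative assumptions, not taken from this page).

```python
# Minimal text-generation sketch. The Hub identifier below is an assumption;
# adjust it to the actual repository name.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="NYTK/text-generation-news-gpt2-small-hungarian",  # assumed identifier
)

# Prompt with the opening of a Hungarian news item and let the model continue it.
prompt = "A kormány ma bejelentette, hogy"
outputs = generator(prompt, max_length=100, do_sample=True, top_k=50)
print(outputs[0]["generated_text"])
```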

Model Features

Hungarian language optimization
Pretrained and fine-tuned specifically for Hungarian, making it well suited to Hungarian text generation tasks
News domain fine-tuning
Fine-tuned on corpus from mainstream Hungarian news websites, particularly suitable for news text generation
Low perplexity
Achieves a perplexity of 22.06 on the news generation task, lower (better) than the related Hungarian poetry generation model (47.46); a sketch of how this metric can be measured follows below
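
Perplexity is the exponential of the model's average per-token negative log-likelihood on held-out text. The snippet below is a rough sketch of how such a figure could be reproduced with transformers; the Hub identifier and evaluation text are assumptions, and the published 22.06 was measured by the authors on their own test set.

```python
# Rough perplexity sketch: exp(mean negative log-likelihood) over a held-out text.
# The Hub identifier is an assumption; the evaluation text is a placeholder.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NYTK/text-generation-news-gpt2-small-hungarian"  # assumed identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
model.eval()

text = "Ide kerül egy kiértékelő hírszöveg."  # placeholder Hungarian news text
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # Passing the inputs as labels makes the model return the mean
    # cross-entropy loss over the sequence.
    loss = model(**inputs, labels=inputs["input_ids"]).loss

print(f"Perplexity: {torch.exp(loss).item():.2f}")
```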

Model Capabilities

Hungarian text generation
News content creation
Headline generation

Use Cases

News media
Automatic news generation
Automatically generate news content from key information
Produces text that conforms to Hungarian news style
News headline generation
Generate attractive headlines based on news content
Content creation
Hungarian creative writing
Assist in Hungarian content creation