BelGPT-2
BelGPT-2 is a GPT-2 model pre-trained on a large French corpus (approximately 60GB), specialized in French text generation tasks.
Downloads: 773
Release Time: 3/2/2022
Model Overview
The first GPT-2 model pre-trained on a large-scale heterogeneous French corpus, supporting high-quality French text generation.
Model Features
Large-scale French corpus training
Pre-trained on an approximately 60GB heterogeneous French corpus covering a wide range of text types.
High-quality text generation
Capable of generating fluent and coherent French text.
Diverse training data
Training data includes CommonCrawl, news, Wikipedia, literary works, and other sources.
Model Capabilities
French text generation
Context continuation
Creative writing
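The capabilities above can be exercised through the Hugging Face transformers library. The sketch below loads the model and continues a French prompt with sampling; the repository id "antoiloui/belgpt2" and the generation settings are assumptions, not values taken from this page, so adjust them to match the checkpoint you actually use.

```python
# Minimal sketch: French text continuation with BelGPT-2 via transformers.
# The model id "antoiloui/belgpt2" is assumed; replace it if your copy of
# the checkpoint is hosted under a different repository name.
from transformers import GPT2LMHeadModel, GPT2Tokenizer
import torch

model_name = "antoiloui/belgpt2"  # assumed repository id
tokenizer = GPT2Tokenizer.from_pretrained(model_name)
model = GPT2LMHeadModel.from_pretrained(model_name)
model.eval()

# Encode a French prompt and let the model continue it.
prompt = "Hier, j'ai visité le musée du Louvre et"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(
        input_ids,
        max_length=60,            # total length, prompt included
        do_sample=True,           # sampling gives more varied continuations
        top_k=50,
        top_p=0.95,
        num_return_sequences=1,
        pad_token_id=tokenizer.eos_token_id,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The same pattern covers news continuation and creative writing from the use cases below: only the prompt changes, while sampling parameters such as top_k and top_p control how adventurous the generated French text is.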
Use Cases
Content creation
News continuation
Automatically generate follow-up content based on the beginning of a news article
Creative writing
Generate creative texts such as novels and poems
Education
French learning assistance
Generate French learning materials and practice texts