
GPT-2 WECHSEL French

Developed by benjamin
A French version of GPT-2 trained using the WECHSEL method, achieving cross-lingual transfer of monolingual language models through effective initialization of subword embeddings.
Downloads 33
Release Time: 3/2/2022

Model Overview

This model is a French language model based on the GPT-2 architecture, transferred from English using the WECHSEL method, suitable for French text generation tasks.
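A minimal generation sketch using the Hugging Face `transformers` library, assuming the model is published on the Hub under the developer's namespace as `benjamin/gpt2-wechsel-french`; the prompt and sampling parameters are illustrative, not prescribed by the model card.

```python
from transformers import pipeline

# Load the French GPT-2 model from the Hugging Face Hub
# (model ID assumed from the developer and model name on this card).
generator = pipeline("text-generation", model="benjamin/gpt2-wechsel-french")

# Example prompt; sampling settings are arbitrary illustrations.
result = generator(
    "Le temps aujourd'hui est",
    max_new_tokens=30,
    do_sample=True,
    top_k=50,
)
print(result[0]["generated_text"])
```

As with any GPT-2 checkpoint, the pipeline returns a list of dictionaries with a `generated_text` field containing the prompt plus the continuation.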

Model Features

Cross-lingual transfer
Efficient transfer from English to French using the WECHSEL method, significantly reducing training costs.
Efficient training
Compared to training from scratch, the WECHSEL method can reduce training costs by a factor of up to 64.
Superior performance
Outperforms models trained from scratch on French text generation tasks.
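The core idea of WECHSEL is to initialize each target-language subword embedding from the embeddings of similar source-language subwords, so the transferred model starts close to a sensible solution instead of from random weights. The sketch below illustrates that initialization step with NumPy; the similarity matrix is taken as given (in WECHSEL it is derived from cross-lingually aligned static word embeddings), and the function name, `k`, and softmax weighting are illustrative simplifications, not the paper's exact procedure.

```python
import numpy as np

def wechsel_style_init(source_emb, similarity, k=10):
    """Initialize target-language subword embeddings as similarity-weighted
    averages of source-language subword embeddings (WECHSEL-style sketch).

    source_emb : (V_src, d) source subword embedding matrix
    similarity : (V_tgt, V_src) cross-lingual similarity scores
    k          : number of nearest source subwords to average over
    """
    V_tgt = similarity.shape[0]
    d = source_emb.shape[1]
    target_emb = np.zeros((V_tgt, d))
    for t in range(V_tgt):
        # Pick the k most similar source subwords for this target subword.
        nn = np.argsort(similarity[t])[-k:]
        # Turn their similarity scores into softmax weights.
        w = np.exp(similarity[t, nn])
        w /= w.sum()
        # The new embedding is a weighted average of source embeddings.
        target_emb[t] = w @ source_emb[nn]
    return target_emb

# Toy example: 5 source subwords, 3 target subwords, 4-dim embeddings.
rng = np.random.default_rng(0)
src = rng.normal(size=(5, 4))
sim = rng.uniform(size=(3, 5))
tgt = wechsel_style_init(src, sim, k=2)
print(tgt.shape)  # (3, 4)
```

Because every target embedding starts as a convex combination of nearby source embeddings, the rest of the transformer's weights can be copied over unchanged and fine-tuned, which is where the large training-cost reduction comes from.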

Model Capabilities

French text generation
Language model transfer

Use Cases

Natural Language Processing
French text generation
Generate fluent French text content
Achieves a perplexity of 19.71, outperforming a comparable model trained from scratch (20.47)
Language model transfer
Transfer English language models to French
Significantly reduced training costs
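The perplexity figures quoted above follow the standard definition: the exponential of the mean negative log-likelihood per token, so lower is better. A small self-contained sketch (the helper name and toy values are illustrative):

```python
import math

def perplexity(token_log_probs):
    """Perplexity = exp of the mean negative log-likelihood per token."""
    nll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(nll)

# Sanity check: a uniform model over a 100-token vocabulary assigns
# log-prob log(1/100) to every token, so its perplexity is exactly 100.
logps = [math.log(1 / 100)] * 8
print(round(perplexity(logps), 6))  # 100.0
```

On this scale, the gap between 19.71 and 20.47 means the WECHSEL-transferred model is, on average, measurably less "surprised" by held-out French text than the from-scratch baseline.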