
T5 Base Dutch

Developed by yhavinga
This is a pre-trained Dutch model based on the T5 architecture, with 222 million parameters, trained on the cleaned Dutch mC4 dataset.
Downloads: 102
Release date: 3/2/2022

Model Overview

This model adopts the T5 architecture and is pre-trained with a masked language modeling (span-corruption) objective; it is intended to be fine-tuned for downstream NLP tasks.
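The checkpoint appears to be published on the Hugging Face Hub under the developer's namespace as yhavinga/t5-base-dutch. The following is a minimal sketch, assuming that identifier and the standard transformers T5 classes, of loading the model and probing the span-corruption behaviour it was pre-trained with; it is not an official usage recipe from the model card.

```python
# Minimal sketch of loading the checkpoint with Hugging Face transformers.
# Assumes the model is available on the Hub as "yhavinga/t5-base-dutch".
from transformers import T5TokenizerFast, T5ForConditionalGeneration

model_name = "yhavinga/t5-base-dutch"  # assumed Hub identifier
tokenizer = T5TokenizerFast.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# The checkpoint was trained only with the span-corruption (MLM) objective,
# so raw generations mainly demonstrate sentinel-token filling;
# real applications require task-specific fine-tuning first.
text = "Het weer in Amsterdam is vandaag <extra_id_0> en zonnig."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```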

Model Features

Dutch Optimization
Specifically pre-trained for Dutch using the cleaned Dutch mC4 dataset
T5 Architecture
Adopts the standard T5-base architecture, supporting text-to-text transformation tasks
Efficient Pre-training
Trained for 1 epoch on a TPU in 2 days and 9 hours, processing 35 billion tokens

Model Capabilities

Text generation
Text summarization
Machine translation
Text classification

Use Cases

Text Processing
News Summarization
Can be used, after fine-tuning, to generate summaries of Dutch news articles (see the fine-tuning sketch below)
Achieved a Rouge1 score of 0.70 in evaluation
English-Dutch Translation
After fine-tuning, the model can be used for translation between English and Dutch
Achieved a Bleu score of 0.78 in evaluation
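Both use cases require sequence-to-sequence fine-tuning. The sketch below is not taken from the model card; it shows one plausible setup with transformers' Seq2SeqTrainer for the summarization case, where the dataset path, column names, task prefix, and hyperparameters are all placeholder assumptions.

```python
# Hedged sketch: fine-tune the checkpoint for Dutch summarization with
# transformers' Seq2SeqTrainer. Dataset name and column names are placeholders.
from transformers import (
    T5TokenizerFast,
    T5ForConditionalGeneration,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
)
from datasets import load_dataset

model_name = "yhavinga/t5-base-dutch"  # assumed Hub identifier
tokenizer = T5TokenizerFast.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Placeholder dataset: replace with a real Dutch news-summarization corpus
# that has "article" and "summary" columns.
dataset = load_dataset("path/to/dutch-news-summaries")

def preprocess(batch):
    # "summarize:" prefix follows the usual T5 text-to-text convention.
    model_inputs = tokenizer(
        ["summarize: " + doc for doc in batch["article"]],
        max_length=512,
        truncation=True,
    )
    labels = tokenizer(batch["summary"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(preprocess, batched=True)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(
        output_dir="t5-base-dutch-summarization",
        per_device_train_batch_size=8,
        num_train_epochs=1,
        predict_with_generate=True,
    ),
    train_dataset=tokenized["train"],
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```

A translation fine-tune would follow the same pattern with a parallel English-Dutch corpus and a different task prefix (e.g. "translate English to Dutch:"), in keeping with the usual T5 text-to-text convention.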