
T5 Base NL36 Finnish

Developed by Finnish-NLP
A T5 model pre-trained on Finnish with a span-based masked language modeling objective; it requires fine-tuning before use on downstream tasks
Downloads 19
Release Time: 4/15/2022

Model Overview

This is a T5 model pre-trained in a self-supervised manner on a large corpus of Finnish text. It uses an encoder-decoder architecture that frames every NLP problem as a text-to-text task. The model must be fine-tuned for a specific task before practical use.
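The span-based objective can be sketched in plain Python: contiguous spans of the input are replaced by sentinel tokens, and the target sequence reconstructs the dropped spans in order. This is a minimal illustration only; the actual pre-training samples span positions and lengths randomly and operates on subword ids rather than whole words.

```python
def span_corrupt(tokens, spans):
    """Replace the given (start, length) spans with sentinel tokens,
    producing an (input, target) pair in T5's span-corruption format."""
    inp, tgt = [], []
    cursor = 0
    for i, (start, length) in enumerate(spans):
        sentinel = f"<extra_id_{i}>"  # T5-style sentinel token
        inp.extend(tokens[cursor:start])   # keep text up to the span
        inp.append(sentinel)               # mask the span in the input
        tgt.append(sentinel)               # target announces which span follows
        tgt.extend(tokens[start:start + length])  # target restores the span
        cursor = start + length
    inp.extend(tokens[cursor:])
    return inp, tgt

# Example: mask the second word of a Finnish sentence.
inp, tgt = span_corrupt("hyvää huomenta kaikille ystävät".split(), [(1, 1)])
# inp → ["hyvää", "<extra_id_0>", "kaikille", "ystävät"]
# tgt → ["<extra_id_0>", "huomenta"]
```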

Model Features

Efficient Deep Architecture
Uses a deep-narrow 36-layer transformer architecture that outperforms the standard 12-layer T5-base
Improved Pre-training Techniques
Incorporates the T5 v1.1 improvements: GEGLU activation in the feed-forward layers, no dropout during pre-training, and pre-training on the masked language modeling objective alone (no supervised task mixing)
High-quality Training Data
Trained on 76GB of rigorously cleaned Finnish text from diverse sources including Wikipedia and news
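Of these improvements, the GEGLU activation is easy to illustrate: the T5 v1.1 feed-forward block gates a GELU-activated linear projection with a second, parallel linear projection. A minimal NumPy sketch (the weight shapes here are illustrative, not the model's actual dimensions):

```python
import numpy as np

def gelu(x):
    # tanh approximation of GELU, commonly used in T5 implementations
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def geglu(x, W, V):
    # GEGLU: GELU(x @ W) gated elementwise by a parallel projection x @ V
    return gelu(x @ W) * (x @ V)

# Toy dimensions for demonstration (batch 2, hidden 4, feed-forward 8).
rng = np.random.default_rng(0)
x = rng.normal(size=(2, 4))
W = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = geglu(x, W, V)  # shape (2, 8)
```

In the full feed-forward block this gated output is then projected back down to the hidden size; only the gating itself is shown here.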

Model Capabilities

Text generation
Text transformation
Sequence-to-sequence tasks

Use Cases

Text Processing
Case and Punctuation Correction
After fine-tuning, the model can automatically correct casing and punctuation errors in Finnish text
See the Finnish-NLP/t5-small-nl24-casing-punctuation-correction model for an example
Text Classification
News Classification
Achieved 94.4% accuracy when fine-tuned on the Yle news dataset
Outperforms multilingual mT5 models of similar parameter count
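Under the text-to-text framing, such a classifier is trained to generate the label string itself rather than predict a class index. A minimal sketch, assuming a hypothetical Finnish task prefix and label set (not taken from the actual Yle fine-tuning setup):

```python
# Hypothetical label strings; the real Yle category set is not listed here.
LABELS = ["urheilu", "talous", "kulttuuri"]

def to_text2text(headline, label):
    # T5 frames classification as generation: the input gets a task
    # prefix, and the target is simply the label string.
    return (f"luokittele uutinen: {headline}", label)

def accuracy(predictions, references):
    # Exact-match accuracy over generated label strings.
    correct = sum(p == r for p, r in zip(predictions, references))
    return correct / len(references)

pair = to_text2text("Jääkiekon MM-kisat alkavat", "urheilu")
# pair → ("luokittele uutinen: Jääkiekon MM-kisat alkavat", "urheilu")
```

At evaluation time the decoder's generated string is compared against the reference label, which is how an accuracy figure like 94.4% is computed for a text-to-text model.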