T5 Base Japanese Web
T5 model pretrained on Japanese web text with byte fallback support and a 32K vocabulary size
Downloads: 4,917
Release Date: 3/2/2022
Model Overview
This is a T5 (Text-to-Text Transfer Transformer) model pretrained on Japanese web text and intended for a broad range of Japanese NLP tasks.
Model Features
Large Vocabulary Support
Features a 32K vocabulary for broad coverage of Japanese text
Byte Fallback Functionality
Falls back to UTF-8 byte tokens for characters outside the vocabulary, so unknown words are never mapped to an unknown token (a short sketch follows this section)
Large-Scale Pretraining
Pretrained on the Japanese portion of mC4 and Japanese Wikipedia, covering a wide range of web text
TPU-Optimized Training
Efficiently trained on TPU v3-8, completing 1 million steps in approximately 126 hours
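Byte fallback is easiest to see at the tokenizer level. Below is a minimal sketch using Hugging Face transformers; the Hub model identifier is an assumption based on the model name, not confirmed by this page, and should be replaced with the actual repository name.

```python
# Minimal byte-fallback sketch with Hugging Face transformers.
# NOTE: the Hub model ID below is an assumption based on the model name.
from transformers import T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("megagonlabs/t5-base-japanese-web")

# A rare kanji (U+29E3D) is unlikely to appear in a 32K vocabulary.
tokens = tokenizer.tokenize("干物の𩸽を焼く")
print(tokens)
# With byte fallback, an out-of-vocabulary character is decomposed into
# UTF-8 byte tokens such as <0xF0> <0xA9> <0xB8> <0xBD> instead of
# collapsing to <unk>, so no input text is lost.
```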
Model Capabilities
Japanese Text Understanding
Japanese Text Generation
Text Conversion Tasks
Language Model Fine-Tuning
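As a sketch of how these capabilities are exercised, the snippet below loads the model for text-to-text generation with Hugging Face transformers. The Hub identifier is again an assumption, and note that a T5 checkpoint pretrained with the span-corruption objective is normally fine-tuned on a downstream task before it produces useful outputs.

```python
# Text-to-text generation sketch (assumed Hub ID; fine-tune before real use).
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_id = "megagonlabs/t5-base-japanese-web"  # assumption, not confirmed
tokenizer = T5Tokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Every task -- summarization, QA, classification -- is posed as
# plain text in, plain text out.
inputs = tokenizer("ニュース記事を要約してください: ...", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```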
Use Cases
Natural Language Processing
Japanese Text Summarization
Automatically summarize Japanese articles
Japanese Question Answering System
Build a knowledge-based QA system for Japanese
Japanese Text Classification
Perform multi-category classification on Japanese text
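All three use cases reduce to supervised fine-tuning on input/target text pairs. The sketch below shows one plausible way to format such pairs and compute a training loss; the task prefixes and placeholder strings are illustrative conventions introduced here, not something the model prescribes.

```python
# Fine-tuning format sketch: each use case becomes (input text, target text).
# Prefixes like "要約:" are illustrative conventions, not fixed by the model.
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_id = "megagonlabs/t5-base-japanese-web"  # assumed Hub identifier
tokenizer = T5Tokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

pairs = [
    ("要約: <記事本文>", "<要約文>"),                    # summarization
    ("質問: <質問文> 文脈: <参照テキスト>", "<回答>"),    # question answering
    ("分類: <テキスト>", "スポーツ"),                    # classification as label text
]

batch = tokenizer(
    [src for src, _ in pairs],
    text_target=[tgt for _, tgt in pairs],
    padding=True,
    return_tensors="pt",
)
# Replace padded label positions with -100 so the loss ignores them.
batch["labels"][batch["labels"] == tokenizer.pad_token_id] = -100

loss = model(**batch).loss  # standard cross-entropy over target tokens
loss.backward()
```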