
Randeng T5 77M

Developed by IDEA-CCNL
A lightweight Chinese version of the mT5-small model, specialized in natural language transformation tasks
Release date: 6/8/2022

Model Overview

Adapted from the mT5-small architecture for Chinese, this model focuses on natural language transformation tasks and is well suited to Chinese text processing.

Model Features

Chinese Optimization
Specially optimized for Chinese, strengthening Chinese text processing capabilities
Lightweight Model
Only 77M parameters, suitable for resource-constrained environments
Efficient Training
Incrementally pre-trained on the 180GB WuDao corpus using the Corpus-Adaptive Pre-training (CAPT) technique, achieving high training efficiency
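T5-family models such as this one are pre-trained with a span-corruption objective: contiguous spans of the input are replaced by sentinel tokens (`<extra_id_0>`, `<extra_id_1>`, …), and the target sequence lists each sentinel followed by the tokens it replaced. The helper below is a minimal sketch of that input/target construction (the function name and the fixed-span interface are illustrative, not from the Randeng codebase):

```python
def corrupt_spans(tokens, spans):
    """Build a T5-style span-corruption (input, target) pair.

    tokens: list of string tokens.
    spans:  list of (start, length) index pairs to mask, in order.
    """
    inp, tgt = [], []
    pos, sid = 0, 0
    for start, length in spans:
        inp.extend(tokens[pos:start])          # keep unmasked prefix
        inp.append(f"<extra_id_{sid}>")        # sentinel replaces the span
        tgt.append(f"<extra_id_{sid}>")        # target: sentinel + masked tokens
        tgt.extend(tokens[start:start + length])
        pos = start + length
        sid += 1
    inp.extend(tokens[pos:])                   # keep unmasked suffix
    tgt.append(f"<extra_id_{sid}>")            # final sentinel ends the target
    return " ".join(inp), " ".join(tgt)

tokens = "Beijing has a long history of culture and tradition .".split()
inp, tgt = corrupt_spans(tokens, [(6, 1), (8, 1)])
# inp → "Beijing has a long history of <extra_id_0> and <extra_id_1> ."
# tgt → "<extra_id_0> culture <extra_id_1> tradition <extra_id_2>"
```

The model learns to emit the target given the corrupted input, which is why inference prompts for this model use the same sentinel tokens.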

Model Capabilities

Text Generation
Natural Language Transformation

Use Cases

Text Processing
Text Completion
Complete missing text fragments based on context
Example input: 'Beijing has a long history of <extra_id_0> and <extra_id_1>.'
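The example above can be run with the Hugging Face `transformers` library. The sketch below assumes the checkpoint is published under the id `IDEA-CCNL/Randeng-T5-77M` (an assumption based on the publisher and model name) and uses the standard mT5 classes:

```python
from transformers import T5Tokenizer, MT5ForConditionalGeneration

CKPT = "IDEA-CCNL/Randeng-T5-77M"  # assumed Hugging Face checkpoint id

def fill_spans(text: str) -> str:
    """Fill <extra_id_n> sentinel spans in `text` with model predictions."""
    tokenizer = T5Tokenizer.from_pretrained(CKPT)
    model = MT5ForConditionalGeneration.from_pretrained(CKPT)
    ids = tokenizer(text, return_tensors="pt").input_ids
    out = model.generate(ids, max_new_tokens=20)
    return tokenizer.decode(out[0], skip_special_tokens=False)

if __name__ == "__main__":
    # Chinese rendering of the example input from the use case above
    print(fill_spans("北京有悠久的<extra_id_0>和<extra_id_1>。"))
```

The decoded output pairs each sentinel with the model's predicted completion, following the span-corruption format used during pre-training.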