
T5 Small Wikilingua Vietnamese

Developed by minhtoan
A state-of-the-art lightweight pretrained model for Vietnamese, based on the Transformer encoder-decoder architecture and specialized for text summarization.
Downloads: 43
Release Time: 11/24/2022

Model Overview

This model is a lightweight pretrained model designed specifically for abstractive text summarization in Vietnamese. Built on the Transformer encoder-decoder architecture and trained on the Vietnamese portion of the WikiLingua dataset, it generates high-quality summaries of input documents.
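Below is a minimal inference sketch using the Hugging Face transformers library. The repository ID "minhtoan/t5-small-wikilingua-vietnamese" is assumed from the model title and developer name; verify the exact ID on the Hub before use.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "minhtoan/t5-small-wikilingua-vietnamese"  # assumed repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "..."  # a Vietnamese article or document to summarize

# Truncate the input to the model's 512-token limit noted below.
inputs = tokenizer(text, max_length=512, truncation=True, return_tensors="pt")

# Generate a summary capped at 256 tokens, matching the use-case description.
summary_ids = model.generate(
    **inputs,
    max_length=256,
    num_beams=4,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```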

Model Features

Lightweight Design
The model adopts a small-scale design, suitable for deployment in resource-constrained environments
Vietnamese Optimization
Specially optimized and trained for Vietnamese text
Long Text Processing
Supports input texts up to 512 tokens in length

Model Capabilities

Vietnamese text comprehension
Abstractive text summarization
Long text compression

Use Cases

Content Summarization
News Summarization
Automatically generates key summaries for Vietnamese news articles
Produces concise summaries within 256 tokens
Document Condensation
Compresses long documents into brief summaries
Retains core information from the original text
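For batch use cases such as summarizing multiple news articles, the transformers summarization pipeline can be used instead of calling generate directly. This is a sketch under the same assumed repository ID; adjust the model name and generation settings as needed.

```python
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="minhtoan/t5-small-wikilingua-vietnamese",  # assumed repo ID
)

articles = ["...", "..."]  # Vietnamese news articles to condense

# Cap summaries at 256 tokens and truncate over-long inputs to the model limit.
summaries = summarizer(articles, max_length=256, truncation=True)
for s in summaries:
    print(s["summary_text"])
```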