
BARTpho Word

Developed by vinai
BARTpho is the first pre-trained sequence-to-sequence model for Vietnamese. It is released in syllable and word versions and is suited to generative natural language processing tasks.
Downloads 3,584
Release Time: 3/2/2022

Model Overview

BARTpho is a large-scale monolingual sequence-to-sequence model pre-trained specifically for Vietnamese. It adopts the 'large' architecture and pre-training scheme of BART and excels at generative tasks such as text summarization.
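As a minimal sketch of basic usage, assuming the word-level checkpoint is published on the Hugging Face Hub as vinai/bartpho-word, the model can be loaded through the transformers library to extract contextual features (the example sentence and its word segmentation are purely illustrative):

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Load the word-level BARTpho checkpoint (assumed Hub ID: vinai/bartpho-word).
tokenizer = AutoTokenizer.from_pretrained("vinai/bartpho-word")
model = AutoModel.from_pretrained("vinai/bartpho-word")

# The word version expects word-segmented input, i.e. multi-syllable words
# joined with underscores by an external Vietnamese word segmenter.
line = "Chúng_tôi là những nghiên_cứu_viên ."

inputs = tokenizer(line, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Last-layer hidden states produced by the seq2seq model.
print(outputs.last_hidden_state.shape)
```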

Model Features

Vietnamese-specific model
The first sequence-to-sequence pre-trained model optimized for Vietnamese
Dual-version support
Provides syllable and word versions to accommodate different preprocessing needs (illustrated in the sketch after this list)
Superior generative performance
Outperforms baseline models like mBART in Vietnamese text summarization tasks
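To illustrate the dual-version feature, the sketch below tokenizes the same sentence with each version's tokenizer. The Hub IDs vinai/bartpho-syllable and vinai/bartpho-word are assumptions, and the underscore-joined input for the word version presupposes prior word segmentation with an external Vietnamese segmenter:

```python
from transformers import AutoTokenizer

# Assumed Hub IDs for the two released versions.
syllable_tok = AutoTokenizer.from_pretrained("vinai/bartpho-syllable")
word_tok = AutoTokenizer.from_pretrained("vinai/bartpho-word")

# The syllable version works on raw text; the word version expects text
# that has already been word-segmented (underscores join multi-syllable words).
raw = "Chúng tôi là những nghiên cứu viên ."
segmented = "Chúng_tôi là những nghiên_cứu_viên ."

print(syllable_tok.tokenize(raw))
print(word_tok.tokenize(segmented))
```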

Model Capabilities

Vietnamese text generation
Sequence-to-sequence transformation
Text summarization

Use Cases

Text processing
Vietnamese text summarization
Automatically generates summaries for Vietnamese texts
Outperforms the mBART model in both automatic and human evaluations
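The pre-trained model does not summarize out of the box; it must first be fine-tuned on a Vietnamese summarization dataset. A hedged sketch of running a fine-tuned checkpoint through the transformers generate API (the model ID below is hypothetical) could look like this:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Hypothetical ID of a BARTpho-word checkpoint fine-tuned for summarization.
model_id = "your-org/bartpho-word-finetuned-summarization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Placeholder for a word-segmented Vietnamese document to summarize.
document = "..."

inputs = tokenizer(document, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(
    **inputs,
    num_beams=4,        # beam search, commonly used for summarization
    max_length=256,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```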