
BART Large Chinese

Developed by fnlp
A Chinese pre-trained model based on the BART architecture that supports both text generation and text understanding tasks, released by Fudan University's Natural Language Processing Lab (fnlp)
Downloads: 638
Release Time: 3/2/2022

Model Overview

A pre-trained unbalanced Transformer for both Chinese language understanding and generation, applicable to a wide range of natural language processing tasks
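
For orientation, here is a minimal loading sketch using the Hugging Face transformers library. The checkpoint id fnlp/bart-large-chinese and the choice of BertTokenizer are assumptions based on the model name and developer listed above, not details given in this listing.

```python
# Minimal loading sketch; "fnlp/bart-large-chinese" is an assumed checkpoint id.
from transformers import BertTokenizer, BartForConditionalGeneration

# Chinese BART checkpoints ship a BERT-style vocabulary, so BertTokenizer
# is commonly used here instead of the default BartTokenizer.
tokenizer = BertTokenizer.from_pretrained("fnlp/bart-large-chinese")
model = BartForConditionalGeneration.from_pretrained("fnlp/bart-large-chinese")
```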

Model Features

Vocabulary Optimization
Adopts a new vocabulary of 51,271 tokens, adding more than 6,800 previously missing Chinese characters and removing redundant tokens, which lowers the out-of-vocabulary rate
Long Sequence Support
Position embeddings extended from 512 to 1024, supporting longer input sequences (both this limit and the vocabulary size can be checked with the sketch after this list)
Stable Performance
Maintains the original model's performance through vocabulary alignment and incremental training
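
The two figures above (the 51,271-token vocabulary and the 1024-position limit) can be verified directly on a loaded checkpoint. A short inspection sketch, assuming the same checkpoint id as in the loading example and the standard transformers config attributes:

```python
from transformers import BertTokenizer, BartForConditionalGeneration

tokenizer = BertTokenizer.from_pretrained("fnlp/bart-large-chinese")
model = BartForConditionalGeneration.from_pretrained("fnlp/bart-large-chinese")

# Vocabulary size after the re-tokenization described above.
print(len(tokenizer))                        # expected: 51271
# Maximum sequence length after extending the position embeddings.
print(model.config.max_position_embeddings)  # expected: 1024
```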

Model Capabilities

Text Generation
Text Understanding
Text Summarization
Question Answering
Text Completion

Use Cases

Text Generation
Capital Completion
Fills a masked span to generate a complete statement about a national capital
Input: 'Beijing is the capital of [MASK]'; output: 'Beijing is the capital of the People's Republic of China' (a runnable sketch follows below)
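
A sketch of reproducing this example with the transformers Text2TextGenerationPipeline. The Chinese input below corresponds to the English prompt above; the exact decoded text may differ across library versions:

```python
from transformers import (BertTokenizer, BartForConditionalGeneration,
                          Text2TextGenerationPipeline)

tokenizer = BertTokenizer.from_pretrained("fnlp/bart-large-chinese")
model = BartForConditionalGeneration.from_pretrained("fnlp/bart-large-chinese")
generator = Text2TextGenerationPipeline(model, tokenizer)

# "Beijing is the capital of [MASK]" -- the model fills the masked span.
result = generator("北京是[MASK]的首都", max_length=50, do_sample=False)
print(result)  # e.g. [{'generated_text': '北 京 是 中 国 的 首 都'}]
```

Note that the BERT-style tokenizer decodes with spaces between Chinese characters, as shown in the sample output.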
Academic Research
Chinese NLP Research
Used as a baseline model in Chinese natural language processing research
Performs well on benchmarks such as AFQMC and IFLYTEK