Randeng-BART-139M-SUMMARY
A 139-million-parameter Chinese summarization model based on the BART architecture, specialized for generating summaries of Chinese text.
Downloads: 504
Release Time: 4/26/2022
Model Overview
This model is a summarization-specialized version of Randeng-BART-139M, fine-tuned on the LCSTS Chinese text summarization dataset, and is primarily used to generate summaries of Chinese text.
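The checkpoint is meant to be used with Hugging Face transformers. Below is a minimal usage sketch, assuming the repository ID IDEA-CCNL/Randeng-BART-139M-SUMMARY on the Hugging Face Hub; the ID and the example input are illustrative and may need adjusting.

```python
from transformers import AutoTokenizer, BartForConditionalGeneration, Text2TextGenerationPipeline

# Repository ID is an assumption; adjust it if the checkpoint lives under a different path.
model_id = "IDEA-CCNL/Randeng-BART-139M-SUMMARY"
tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False)  # a fast tokenizer may not be available for this checkpoint
model = BartForConditionalGeneration.from_pretrained(model_id)

# The pipeline handles tokenization, generation, and decoding in one call.
summarizer = Text2TextGenerationPipeline(model=model, tokenizer=tokenizer)

text = "今天上午，某市举行了一场国际马拉松比赛，来自三十多个国家的选手参加了角逐。"  # example Chinese input
print(summarizer(text, max_length=64, do_sample=False))  # returns [{"generated_text": "..."}]
```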
Model Features
Optimized for Chinese Text Summarization
Fine-tuned specifically for Chinese text summarization tasks, capable of generating high-quality summaries.
Based on BART Architecture
Built on the BART-base architecture, which combines a bidirectional encoder with an autoregressive decoder.
Compact and Efficient Model
With only 139 million parameters, it offers high inference efficiency while maintaining summarization quality (see the configuration sketch below).
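A short sketch for inspecting the loaded configuration to confirm the encoder-decoder layout and the roughly 139M parameter count; it reuses the hub ID assumed in the overview sketch.

```python
from transformers import BartForConditionalGeneration

# Hub ID assumed, as in the overview sketch above.
model = BartForConditionalGeneration.from_pretrained("IDEA-CCNL/Randeng-BART-139M-SUMMARY")

# BART pairs a bidirectional encoder with an autoregressive decoder.
cfg = model.config
print(f"encoder layers: {cfg.encoder_layers}, decoder layers: {cfg.decoder_layers}")
print(f"hidden size: {cfg.d_model}, attention heads: {cfg.encoder_attention_heads}")

# The total parameter count should be on the order of 139 million.
n_params = sum(p.numel() for p in model.parameters())
print(f"parameters: {n_params / 1e6:.1f}M")
```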
Model Capabilities
Text Summarization
Chinese Natural Language Processing
Text Compression
Use Cases
News Summarization
Sports News Summarization
Automatically generates concise summaries of sports event reports
As shown in the provided examples, it can accurately extract key match information and results
Content Summarization
Long Article Summarization
Compresses lengthy articles into brief summaries
Preserves key information from the original text while removing redundant content (a generation-settings sketch follows below)
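A sketch of generation settings for longer inputs; the beam count and length limits below are illustrative assumptions, not values documented by the model authors.

```python
from transformers import AutoTokenizer, BartForConditionalGeneration, Text2TextGenerationPipeline

model_id = "IDEA-CCNL/Randeng-BART-139M-SUMMARY"  # hub ID assumed, as above
summarizer = Text2TextGenerationPipeline(
    model=BartForConditionalGeneration.from_pretrained(model_id),
    tokenizer=AutoTokenizer.from_pretrained(model_id, use_fast=False),
)

long_article = "（此处替换为一篇较长的中文文章）"  # placeholder for a long Chinese article
# Very long inputs may need to be truncated to the model's maximum input length beforehand.

result = summarizer(
    long_article,
    max_length=80,           # cap the summary length
    min_length=16,           # avoid degenerate, very short outputs
    num_beams=4,             # beam search tends to give more fluent summaries
    no_repeat_ngram_size=3,  # discourage repeated phrases
    do_sample=False,
)
print(result[0]["generated_text"])
```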