
LSG BART Base 16384 MediaSum

Developed by ccdv
A BART model built on LSG (Local-Sparse-Global) attention, optimized for long-sequence summarization and supporting inputs of up to 16,384 tokens.
Downloads: 22
Release Date: 2022-06-23

Model Overview

This model uses a Local-Sparse-Global (LSG) attention mechanism and is fine-tuned on the MediaSum dataset, making it well suited to long-text summarization.
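A minimal usage sketch in Python, assuming the checkpoint is published on the Hugging Face Hub as ccdv/lsg-bart-base-16384-mediasum and that, as is typical for LSG checkpoints, the custom attention code must be loaded with trust_remote_code=True; the input document and generation settings are placeholders, not the ones used to produce the scores reported below.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "ccdv/lsg-bart-base-16384-mediasum"  # assumed Hub identifier
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name, trust_remote_code=True)

# Any long document, e.g. a news article or interview transcript.
document = "..."

# Tokenize up to the model's 16,384-token input window.
inputs = tokenizer(document, max_length=16384, truncation=True, return_tensors="pt")

# Generation settings are illustrative only.
summary_ids = model.generate(**inputs, max_length=256, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```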

Model Features

Long Sequence Processing Capability
Handles input sequences of up to 16,384 tokens, making it suitable for long-document summarization.
Hybrid Attention Mechanism
Combines local, sparse, and global attention to capture long-range dependencies efficiently (see the configuration sketch after this list).
Efficient Fine-Tuning
Fine-tuned in a single epoch, retaining strong performance while keeping computational costs low.
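One way to see how the local, sparse, and global components are parameterized for this checkpoint is to inspect its configuration. This is a sketch: the attribute names (block_size, sparse_block_size, sparsity_factor, num_global_tokens) follow the public LSG implementation and are assumptions here, so they are read defensively.

```python
from transformers import AutoConfig

# trust_remote_code is assumed to be required, as is typical for LSG checkpoints.
config = AutoConfig.from_pretrained(
    "ccdv/lsg-bart-base-16384-mediasum", trust_remote_code=True
)

# Maximum input length the base model was extended to.
print("max_position_embeddings:", config.max_position_embeddings)

# LSG-specific attention settings; attribute names are assumptions,
# so missing ones simply print "not exposed".
for name in ("block_size", "sparse_block_size", "sparsity_factor", "num_global_tokens"):
    print(name, "=", getattr(config, name, "not exposed"))
```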

Model Capabilities

Long Text Summarization
Text Compression
Key Information Extraction

Use Cases

Media Content Processing
News Article Summarization
Automatically generates summaries of the core content of news articles.
ROUGE-L score: 31.81
Interview Transcript Summarization
Extracts key dialogue points from lengthy interview transcripts (see the evaluation sketch below).
ROUGE-1 score: 35.31
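ROUGE scores like those above are typically computed by comparing generated summaries against reference summaries. A hedged sketch of that computation with the Hugging Face evaluate library follows; the prediction and reference strings are placeholders rather than MediaSum data, so the printed numbers will not match the figures reported here.

```python
import evaluate

# Placeholder generated summaries and references (not MediaSum data).
predictions = ["the guest discussed the administration's new climate policy"]
references = ["the interview covered the administration's new climate policy"]

rouge = evaluate.load("rouge")
scores = rouge.compute(predictions=predictions, references=references)
print("ROUGE-1:", scores["rouge1"])
print("ROUGE-L:", scores["rougeL"])
```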