long-t5-tglobal-base-16384-booksum-V12
An optimized long-text summarization model based on the T5 architecture that processes inputs of up to 16,384 tokens and excels at book summarization.
Downloads: 109
Release date: September 9, 2022
Model Overview
This model is optimized for long-document summarization. Built on the LongT5 variant of the T5 architecture, whose transient-global attention extends the usable context length, it is suited to generating summaries of books, scientific papers, and other long-form content.
Model Features
Extended Context Handling
Processes input texts of up to 16,384 tokens, ideal for lengthy content such as book chapters
Domain-Specific Optimization
Fine-tuned on the BookSum dataset, giving strong results when summarizing books and academic literature
Multi-Scale Summarization
Generates summaries of varying lengths (8–64 tokens) to suit different needs
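The features above map directly onto generation parameters in the Hugging Face transformers library. A minimal usage sketch follows; the Hub repository id `pszemraj/long-t5-tglobal-base-16384-booksum-V12` is an assumption based on the model name, and `min_length`/`max_length` reflect the 8–64 token summary range described above.

```python
from transformers import pipeline

# Context window advertised for this checkpoint.
MAX_INPUT_TOKENS = 16384


def summarize_long_text(
    text: str,
    model_id: str = "pszemraj/long-t5-tglobal-base-16384-booksum-V12",  # assumed Hub id
    min_len: int = 8,
    max_len: int = 64,
) -> str:
    """Summarize a long document, keeping the output in the 8-64 token range."""
    summarizer = pipeline("summarization", model=model_id)
    out = summarizer(
        text,
        min_length=min_len,
        max_length=max_len,
        no_repeat_ngram_size=3,  # discourage verbatim repetition in long outputs
        truncation=True,         # clip inputs beyond the 16,384-token window
    )
    return out[0]["summary_text"]
```

For book-length inputs that exceed even this window, a common approach is to summarize chapter by chapter and then summarize the concatenated chapter summaries.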
Model Capabilities
Long Text Summarization
Content Summarization
Book Chapter Summarization
Scientific Paper Summarization
Technical Document Summarization
Use Cases
Academic Research
Rapid Paper Reading
Generates concise summaries for lengthy academic papers, helping researchers quickly grasp core content
Achieves a ROUGE-1 score of 30.00 on scientific paper summarization tasks
Publishing Industry
Book Content Summarization
Automatically generates book chapter summaries for use in tables of contents, reading guides, and other publishing scenarios
Achieves a ROUGE-1 score of 36.14 on the BookSum dataset
Government Reports
Policy Document Summarization
Extracts key information from lengthy government reports
Achieves a ROUGE-1 score of 37.05 on the gov_report dataset