
Mlong T5 Large Sumstew

Developed by Joemgu
This is a multilingual abstractive summarization model for long inputs of up to 16k tokens. Trained on the sumstew dataset, it can generate a title and a summary for a given input document.
Release date: 6/11/2023

Model Overview

Based on the T5 architecture, this model is designed for multilingual text summarization and handles long inputs of up to 16k tokens. It supports five languages (English, German, French, Italian, and Spanish) and can generate a title and a summary at the same time.
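A minimal usage sketch with the Hugging Face transformers pipeline is shown below. The Hub id joemgu/mlong-t5-large-sumstew and the generation settings are assumptions inferred from this page, not an official example.

```python
# Minimal sketch: summarize a document with the transformers summarization pipeline.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="joemgu/mlong-t5-large-sumstew",  # assumed Hugging Face Hub id
)

document = "Paste the article or report you want to summarize here..."

result = summarizer(
    document,
    max_length=256,          # upper bound on generated summary length
    min_length=32,           # avoid trivially short outputs
    no_repeat_ngram_size=3,  # reduce repeated phrases
)
print(result[0]["summary_text"])
```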

Model Features

Multilingual Support
Supports summarization in five languages: English, German, French, Italian, and Spanish.
Long Text Processing Capability
Accepts up to 16k input tokens, making it suitable for long-document summarization tasks (see the loading sketch after this list).
Joint Title + Summary Generation
Can generate both document titles and summaries simultaneously, with flexible output formats.
Pretrained + Fine-tuned Architecture
Based on the T5 architecture, fine-tuned specifically on the sumstew dataset.
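The sketch below illustrates loading the tokenizer and model directly and capping the input at the stated 16k-token limit; the Hub id is the same assumption as above. The exact prompt or control format for requesting a joint title plus summary is not documented on this page, so plain summarization is shown.

```python
# Sketch: load the model directly and truncate the input at the 16k-token limit.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "joemgu/mlong-t5-large-sumstew"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

long_document = "A very long report, paper, or article in any of the five supported languages..."

inputs = tokenizer(
    long_document,
    max_length=16384,   # stated 16k input-token limit
    truncation=True,
    return_tensors="pt",
)
output_ids = model.generate(**inputs, max_new_tokens=256, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```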

Model Capabilities

Text Summarization
Title Generation
Multilingual Text Processing
Long Text Comprehension

Use Cases

Content Summarization
News Article Summarization
Automatically generates concise summaries of the key points of news articles.
ROUGE-1 score of 29.7108 on the SAMSum test set.
Academic Paper Summarization
Generates concise summaries for lengthy academic papers.
Content Management
Document Title Generation
Automatically generates meaningful titles for documents.