Flan T5 Large Stacked Samsum 1024

Developed by stacked-summaries
A summarization model based on google/flan-t5-large and fine-tuned on the stacked-samsum-1024 dataset, using a stacked summarization method to improve its information-extraction capability.
Downloads: 16
Released: 12/6/2022

Model Overview

This model is trained with a stacked summarization method, enabling it to better identify and extract key information from text; it is especially well suited to dialogue summarization tasks. The model uses the [NEXT_CONCEPT] token to separate distinct concepts in its output, making information segmentation easier.
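As a sketch of how this segmentation can be consumed downstream: only the [NEXT_CONCEPT] separator token comes from the model card; the example summary string below is hypothetical model output.

```python
# Split a generated stacked summary into its concept segments.
# The separator token [NEXT_CONCEPT] is documented by the model;
# the example summary text is a hypothetical illustration.
SEPARATOR = "[NEXT_CONCEPT]"

def split_concepts(summary: str) -> list[str]:
    """Return the individual concept segments of a stacked summary."""
    return [part.strip() for part in summary.split(SEPARATOR) if part.strip()]

example_output = (
    "Amanda will bring the report to the meeting. [NEXT_CONCEPT] "
    "The deadline was moved to Friday."
)
print(split_concepts(example_output))
# → ['Amanda will bring the report to the meeting.', 'The deadline was moved to Friday.']
```

Splitting on the token in post-processing lets each concept be indexed or displayed separately, which is the practical benefit of the segmentation described above.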

Model Features

Stacked Summarization Training
Trained with the stacked summarization method, enabling the model to better identify and separate key concepts in text
Concept Segmentation Tokens
Uses the [NEXT_CONCEPT] token to segment distinct concepts in the generated summary
Efficient Information Extraction
Focuses on extracting and condensing key information from dialogue text rather than simply mimicking summarization styles

Model Capabilities

Dialogue summarization generation
Multi-concept information extraction
Text condensation

Use Cases

Dialogue Processing
Customer Service Dialogue Summarization
Automatically generates concise summaries of customer service dialogues, highlighting key issues and solutions
ROUGE-1: 47.6682, ROUGE-L: 39.7678
Meeting Minutes Condensation
Condenses lengthy meeting minutes into key decision points and action items
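The ROUGE-1 figure reported above measures unigram overlap between a generated summary and a reference. A minimal sketch of the underlying F-measure (simplified: whitespace tokenization, no stemming, so it will not exactly reproduce scores from standard ROUGE tooling):

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """Unigram-overlap F1 between a candidate and a reference summary."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(round(rouge1_f1("the cat sat on the mat", "the cat lay on the mat"), 4))
# → 0.8333
```

ROUGE-L, the other score reported, instead rewards the longest common subsequence between candidate and reference rather than raw unigram overlap.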