
DialogLED Large 5120

Developed by MingZhong
DialogLM is a pretrained model based on the Longformer Encoder-Decoder (LED) architecture, specifically designed for long dialogue understanding and summarization tasks.
Downloads 441
Release Time: 3/2/2022

Model Overview

DialogLM is a pretrained model for long dialogue understanding and summarization, employing a window-based denoising pretraining strategy and supporting input sequences of up to 5,120 tokens.
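The checkpoint can be used like any LED-based sequence-to-sequence model. The sketch below is a minimal, hedged example of loading it through Hugging Face `transformers` and summarizing a short dialogue; it assumes the checkpoint ID `MingZhong/DialogLED-large-5120` is available and that the generation parameters (beam count, summary length) are illustrative choices, not values recommended by the model authors.

```python
# Illustrative sketch: summarizing a dialogue with DialogLED via transformers.
# Assumes network access to download the "MingZhong/DialogLED-large-5120" checkpoint.
from transformers import AutoTokenizer, LEDForConditionalGeneration

model_id = "MingZhong/DialogLED-large-5120"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = LEDForConditionalGeneration.from_pretrained(model_id)

dialogue = "A: The quarterly numbers look good. B: Agreed, revenue is up twelve percent."
# Inputs up to 5,120 tokens are supported; longer sequences are truncated here.
inputs = tokenizer(dialogue, max_length=5120, truncation=True, return_tensors="pt")
summary_ids = model.generate(inputs["input_ids"], max_length=64, num_beams=4)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```

For real customer-service or meeting transcripts, the input would typically be the concatenated turns of the full conversation rather than a two-line exchange.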

Model Features

Long Dialogue Support
Supports input sequences of up to 5,120 tokens, making it suitable for long dialogue scenarios.
Denoising Pretraining Strategy
Employs a window-based denoising pretraining strategy to enhance performance in long dialogue tasks.
Encoder-Decoder Architecture
Based on the LED architecture, capable of handling both dialogue understanding and summary generation.
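To make the window-based denoising idea concrete, the sketch below corrupts a randomly chosen window of dialogue turns (here with two simplified noise types, speaker masking and turn permutation) and keeps the original window as the reconstruction target. This is a hypothetical pure-Python illustration of the pretraining objective, not the authors' implementation; the function name `window_denoise` and the noise subset are assumptions for the example.

```python
import random

# A simplified subset of noise types; the actual pretraining uses several more.
NOISES = ("speaker_mask", "turn_permutation")

def window_denoise(turns, window_size=3, seed=0):
    """Corrupt a random window of (speaker, text) turns; the pretraining
    target is to reconstruct the original window from the noisy dialogue."""
    rng = random.Random(seed)
    start = rng.randrange(max(1, len(turns) - window_size + 1))
    window = turns[start:start + window_size]
    noisy = list(window)
    if rng.choice(NOISES) == "speaker_mask":
        # Replace speaker names inside the window with a mask token.
        noisy = [("[MASK]", text) for _, text in noisy]
    else:
        # Permute turn order within the window only.
        rng.shuffle(noisy)
    corrupted = turns[:start] + noisy + turns[start + window_size:]
    return corrupted, window  # noisy input, original window as target

turns = [("Alice", "hi"), ("Bob", "hello"),
         ("Alice", "how are you?"), ("Bob", "fine")]
corrupted, target = window_denoise(turns, window_size=2, seed=1)
```

The model sees `corrupted` as input and learns to regenerate `target`, which forces it to model both local turn structure and long-range dialogue context.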

Model Capabilities

Long Dialogue Understanding
Dialogue Summary Generation

Use Cases

Dialogue Systems
Customer Service Dialogue Summarization
Automatically generates summaries of customer service dialogues for subsequent analysis and processing.
Meeting Minutes Summarization
Summarizes lengthy meeting dialogues to extract key information.