DialogLED Base 16384

Developed by MingZhong
DialogLM is a pre-trained model based on the Longformer-Encoder-Decoder (LED) architecture, specifically designed for long dialogue understanding and summarization tasks.
Downloads 566
Release Time: 3/2/2022

Model Overview

DialogLM is a pre-trained model for long dialogue understanding and summarization, employing window-based denoising tasks as pre-training objectives and supporting input sequences of up to 16,384 tokens.
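Since the model follows the LED architecture, it can in principle be used through Hugging Face `transformers`. The sketch below is illustrative only: the checkpoint id, generation parameters, and helper name are assumptions, not taken from the card.

```python
# Minimal summarization sketch for an LED-based model such as DialogLM.
# CHECKPOINT is an assumed Hugging Face Hub id; verify before use.
CHECKPOINT = "MingZhong/DialogLED-base-16384"
MAX_INPUT_TOKENS = 16384  # LED supports inputs up to this length


def summarize(dialogue: str) -> str:
    """Generate a summary of a (possibly very long) dialogue transcript."""
    # Heavy dependencies are imported lazily so the constants above
    # can be inspected without transformers installed.
    from transformers import AutoTokenizer, LEDForConditionalGeneration

    tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
    model = LEDForConditionalGeneration.from_pretrained(CHECKPOINT)
    inputs = tokenizer(
        dialogue,
        max_length=MAX_INPUT_TOKENS,
        truncation=True,
        return_tensors="pt",
    )
    # Beam search settings here are illustrative defaults.
    summary_ids = model.generate(**inputs, max_length=256, num_beams=4)
    return tokenizer.decode(summary_ids[0], skip_special_tokens=True)
```

In practice the tokenizer and model would be loaded once and reused across calls rather than inside the function.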

Model Features

Long Dialogue Support
Supports input sequences of up to 16,384 tokens, making it suitable for long dialogue scenarios.
Window-Based Denoising Task
Uses window-based denoising tasks as pre-training objectives to enhance the model's understanding of long dialogues.
Massive Data Training
Leverages extensive long dialogue data for deep training to improve the model's generalization capabilities.
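The window-based denoising objective above can be illustrated with a toy sketch: pick a window of consecutive utterances and corrupt some of them, then train the model to reconstruct the originals. The mask token, window size, and masking probability below are illustrative assumptions, not the model's actual pre-training configuration.

```python
import random

MASK = "<mask>"  # placeholder mask token (assumed, for illustration)


def window_denoise(utterances, window_size=3, mask_prob=0.5, seed=0):
    """Corrupt a window of consecutive utterances by masking some of them.

    The pre-training task is then to reconstruct the original
    utterances from the corrupted dialogue (illustrative sketch only).
    """
    rng = random.Random(seed)
    # Choose where the corruption window starts.
    start = rng.randrange(max(1, len(utterances) - window_size + 1))
    corrupted = list(utterances)
    for i in range(start, min(start + window_size, len(utterances))):
        if rng.random() < mask_prob:
            corrupted[i] = MASK
    return corrupted


dialogue = [
    "A: Hi, I need help with my order.",
    "B: Sure, what's the order number?",
    "A: It's 12345.",
    "B: Thanks, one moment.",
    "A: All resolved now, thank you.",
]
noisy = window_denoise(dialogue)
```

Utterances outside the sampled window are left untouched, which is what distinguishes window-based denoising from masking spans uniformly across the whole input.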

Model Capabilities

Long Dialogue Understanding
Dialogue Summary Generation

Use Cases

Dialogue Systems
Customer Service Dialogue Summarization
Automatically generates summaries of customer service dialogues to help quickly grasp the conversation content.
Meeting Minutes Summarization
Summarizes lengthy meeting dialogues to extract key information.