Distilbart Ftn Wiki Lingua
Developed by datien228
This is a DistilBART model fine-tuned on the English portion of the WikiLingua dataset, used primarily for research on summarization performance.
Downloads: 20
Released: 7/3/2022
Model Overview
This model is based on the BART architecture and fine-tuned on the WikiLingua dataset for generating short summaries. It is mainly intended for research purposes to explore the performance of pre-trained models in summarization tasks.
Model Features
Fine-tuned on WikiLingua
Optimized specifically for the WikiLingua dataset, suitable for generating short summaries.
Inherits BART advantages
Retains the sequence-to-sequence transformation capabilities of the original BART model.
Research-oriented
Primarily used for studying the performance of fine-tuned models in summarization tasks.
Model Capabilities
Text summarization
English text processing
Use Cases
Research applications
Summarization performance research
Used to study the performance of pre-trained models in summarization tasks.
In practice, the model tends to extract opening sentences as summaries, leaving room for improvement in capturing key points.
Text processing
English document summarization
Generates short summaries for English documents.