Mbart Large 50 Finetuned V1
Developed by z-rahimi-r
A fine-tuned model based on the mbart-large-50 architecture, suitable for multilingual summarization tasks
Downloads: 14
Release Time: 8/11/2022
Model Overview
This model is a fine-tuned version of the mbart-large-50 architecture, primarily intended for text summarization. Although the specific training dataset is not explicitly stated, models in the mBART-50 series support 50 languages.
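If the checkpoint is published on the Hugging Face Hub, it can be loaded with the standard mBART-50 classes from transformers. This is a minimal usage sketch; the repo id "z-rahimi-r/mbart-large-50-finetuned-v1" and the generation settings are assumptions, not published details.

```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_id = "z-rahimi-r/mbart-large-50-finetuned-v1"  # assumed repo id
tokenizer = MBart50TokenizerFast.from_pretrained(model_id)
model = MBartForConditionalGeneration.from_pretrained(model_id)

# mBART-50 tokenizers expect a source-language code, e.g. "en_XX" for English.
tokenizer.src_lang = "en_XX"

article = "Long news article text goes here ..."
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=1024)

summary_ids = model.generate(
    **inputs,
    num_beams=4,        # illustrative beam search settings
    max_length=128,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```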
Model Features
Multilingual support
Built on the mbart-large-50 architecture, the model likely supports text processing in the 50 languages covered by mBART-50
Summarization generation
Fine-tuned and optimized specifically for text summarization tasks
Efficient training
Trained with the Adam optimizer and a linear learning-rate scheduler (see the sketch after this list)
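As a rough sketch of what that training setup looks like in code: the optimizer and scheduler types below match the description above, but the learning rate, warmup, and step counts are placeholder values, since the actual hyperparameters used for this checkpoint are not published.

```python
import torch
from transformers import MBartForConditionalGeneration, get_linear_schedule_with_warmup

# Load the base model that the fine-tuned checkpoint starts from.
model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50")

num_training_steps = 10_000  # placeholder; the real step count is not published
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)  # Adam-family optimizer, placeholder lr
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=500,               # placeholder warmup
    num_training_steps=num_training_steps,
)

# Inside the training loop, each optimizer step is followed by a scheduler step,
# which decays the learning rate linearly over num_training_steps:
# optimizer.step(); scheduler.step(); optimizer.zero_grad()
```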
Model Capabilities
Text summarization generation
Multilingual text processing
Use Cases
Text processing
News summarization
Automatically generate concise summaries of news articles
Multilingual document summarization
Process documents written in multiple languages and generate summaries, as sketched below
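A hedged sketch of the multilingual use case: the same model can be pointed at documents in different languages by switching the tokenizer's mBART-50 language code. The repo id and the choice of languages are assumptions; whether this particular checkpoint was tuned on those languages is not documented.

```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_id = "z-rahimi-r/mbart-large-50-finetuned-v1"  # assumed repo id
tokenizer = MBart50TokenizerFast.from_pretrained(model_id)
model = MBartForConditionalGeneration.from_pretrained(model_id)

documents = {
    "en_XX": "An English news article ...",
    "fr_XX": "Un article de presse en français ...",
}

for lang_code, text in documents.items():
    tokenizer.src_lang = lang_code  # tell the tokenizer which language the input is in
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
    ids = model.generate(**inputs, num_beams=4, max_length=128)
    print(lang_code, "->", tokenizer.decode(ids[0], skip_special_tokens=True))
```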