Bert2bert Shared Spanish Finetuned Summarization
This is a text summarization model built on the Spanish BERT model (BETO) and fine-tuned for automatic summarization of Spanish texts.
Downloads: 3,185
Release date: 3/2/2022
Model Overview
The model adopts the BERT2BERT architecture, initialized from the dccuchile/bert-base-spanish-wwm-cased pre-trained checkpoint and fine-tuned on the Spanish portion of the MLSUM summarization dataset, and it generates abstractive summaries of Spanish text.
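A minimal inference sketch with the Hugging Face transformers library is shown below. The Hub id, generation settings, and helper name are illustrative assumptions, not details taken from this card:

```python
import torch
from transformers import AutoTokenizer, EncoderDecoderModel

# Illustrative Hub id; substitute the actual id of this checkpoint.
MODEL_ID = "mrm8488/bert2bert_shared-spanish-finetuned-summarization"

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = EncoderDecoderModel.from_pretrained(MODEL_ID).to(device)

def summarize(text, max_input_length=512, max_summary_length=128):
    # BETO accepts at most 512 tokens, so longer articles are truncated.
    inputs = tokenizer(
        text,
        truncation=True,
        max_length=max_input_length,
        return_tensors="pt",
    ).to(device)
    summary_ids = model.generate(
        inputs.input_ids,
        attention_mask=inputs.attention_mask,
        max_length=max_summary_length,
        num_beams=4,           # beam-search settings are illustrative defaults
        early_stopping=True,
    )
    return tokenizer.decode(summary_ids[0], skip_special_tokens=True)

article = "Texto de una noticia en español..."  # replace with a real article
print(summarize(article))
```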
Model Features
Spanish language optimization
Built on the Spanish BERT model (BETO) and optimized specifically for Spanish text summarization
Large-scale dataset training
Trained on the Spanish portion of the MLSUM summarization dataset, which contains a large number of paired news articles and summaries
Shared-weight architecture
Uses the BERT2BERT shared-weight architecture, in which the encoder and decoder are initialized from the same pre-trained checkpoint and share their weights (a construction sketch follows this list)
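To illustrate what the shared-weight setup means in practice, the snippet below warm-starts a BERT2BERT model from a single BETO checkpoint with tied encoder/decoder weights using the transformers EncoderDecoderModel API. This is a sketch of the architecture only, not the original training script for this checkpoint:

```python
from transformers import AutoTokenizer, EncoderDecoderModel

CHECKPOINT = "dccuchile/bert-base-spanish-wwm-cased"
tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)

# Encoder and decoder are both warm-started from the same BETO checkpoint;
# tie_encoder_decoder=True makes them share one set of weights.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    CHECKPOINT,
    CHECKPOINT,
    tie_encoder_decoder=True,
)

# Generation-related special tokens must be set before fine-tuning or inference.
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.eos_token_id = tokenizer.sep_token_id
model.config.pad_token_id = tokenizer.pad_token_id
```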
Model Capabilities
Spanish text summarization
Automatic news content summarization
Long text compression
Use Cases
News media
News summarization generation
Automatically generates concise summaries of Spanish news articles
Reaches a ROUGE-2 F-score of 8.7 (see the evaluation sketch after this list)
Content management
Document summarization
Generates key point summaries for long Spanish documents
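For reference, the ROUGE-2 figure above is an F-measure reported on a 0-100 scale. A minimal sketch of computing it with the Hugging Face evaluate library is shown below; the example texts are placeholders, not data from this card:

```python
import evaluate  # also requires the rouge_score package

# Placeholder texts; in practice, predictions come from the model and
# references from a held-out split such as the MLSUM Spanish test set.
predictions = ["El gobierno anunció nuevas medidas económicas."]
references = ["El gobierno presentó un paquete de medidas económicas."]

rouge = evaluate.load("rouge")
scores = rouge.compute(predictions=predictions, references=references)

# scores["rouge2"] is an F-measure in [0, 1]; multiplying by 100 puts it
# on the same scale as the 8.7 reported above.
print(round(scores["rouge2"] * 100, 1))
```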