NASES

Developed by ELiRF
The first monolingual Transformer model designed specifically for abstractive summarization of Spanish news. Specialized pre-training increases the abstractiveness of its summaries, and it outperforms mainstream multilingual models.
Downloads: 14
Release Date: 3/2/2022

Model Overview

A Transformer-based encoder-decoder model designed specifically for abstractive summarization of Spanish news, with generation quality improved through four self-supervised pre-training tasks.

Model Features

Monolingual Specialization
Tailored to the characteristics of the Spanish language, outperforming multilingual models such as mBART and mT5
Enhanced Abstractiveness
Four self-supervised pre-training tasks (e.g., sentence reordering and text infilling) substantially improve the model's ability to rewrite source content rather than copy it
Novel Evaluation Metric
Proposes a 'content reorganization' metric designed to assess semantic rewriting in generated summaries

Model Capabilities

News summary generation
Semantic content reconstruction
Spanish text comprehension
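
The capabilities above can be exercised through the Hugging Face transformers library. The sketch below is a minimal, hedged example: the model id "ELiRF/NASES" is an assumption inferred from the developer and model names and should be verified on the Hub, and the word-level truncation helper is a simplification for articles longer than the encoder's input window.

```python
def truncate_words(text: str, max_words: int = 400) -> str:
    """Lead truncation: keep the first max_words whitespace-separated words,
    a crude way to fit long news articles into the encoder's input window."""
    return " ".join(text.split()[:max_words])


def summarize(article: str, model_name: str = "ELiRF/NASES") -> str:
    """Generate an abstractive Spanish summary with a seq2seq model.

    NOTE: the default model_name is an assumption; check the Hugging Face
    Hub for the actual identifier published by ELiRF.
    """
    # Imported lazily so truncate_words stays usable without transformers installed.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
    inputs = tokenizer(
        truncate_words(article),
        return_tensors="pt",
        truncation=True,
        max_length=512,
    )
    # Beam search tends to give more fluent single-sentence briefs.
    ids = model.generate(**inputs, max_length=128, num_beams=4)
    return tokenizer.decode(ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(summarize("El Gobierno anunció hoy un nuevo paquete de medidas económicas..."))
```

Running the script downloads the model weights on first use; for batch news-briefing pipelines, load the tokenizer and model once and reuse them across articles.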

Use Cases

Media industry
News briefing generation
Automatically generates key-point summaries for Spanish news articles
Produces more abstractive summaries than traditional extractive methods
Corporate analysis
Public opinion monitoring
Extracts key information from Spanish social media
Supports sentiment analysis for low-resource languages