BERT2BERT Shared Turkish Summarization
This is a shared-parameter BERT2BERT model fine-tuned on the MLSUM Turkish dataset for Turkish news summarization.
Release Time: 3/2/2022
Model Overview
This model generates summaries of Turkish news texts. It uses a shared-parameter BERT2BERT architecture and was fine-tuned on the MLSUM Turkish dataset.
Model Features
Shared Parameter Architecture
Uses a BERT2BERT architecture in which the encoder and decoder share parameters, reducing the total parameter count and memory footprint (see the construction sketch after this list).
Turkish Language Optimization
Built on the Turkish pre-trained model dbmdz/bert-base-turkish-cased.
News Summarization Specialization
Optimized specifically for news text summarization tasks and trained on the MLSUM Turkish dataset.
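
The shared-parameter setup can be reproduced with the Transformers EncoderDecoderModel API. The snippet below is a minimal sketch of the usual warm-starting recipe; the special-token settings are illustrative defaults, not necessarily the exact configuration used for this model.

```python
# Minimal sketch: assembling a shared-parameter (tied) BERT2BERT model
# from the Turkish BERT checkpoint with Hugging Face Transformers.
from transformers import AutoTokenizer, EncoderDecoderModel

tokenizer = AutoTokenizer.from_pretrained("dbmdz/bert-base-turkish-cased")

# tie_encoder_decoder=True makes the decoder reuse the encoder's weights,
# which is what the "shared parameter architecture" refers to.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "dbmdz/bert-base-turkish-cased",
    "dbmdz/bert-base-turkish-cased",
    tie_encoder_decoder=True,
)

# Token settings needed for generation (illustrative defaults).
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.eos_token_id = tokenizer.sep_token_id
model.config.pad_token_id = tokenizer.pad_token_id
```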
Model Capabilities
Turkish Text Understanding
News Summary Generation
Long Text Compression
Use Cases
News Media
Automatic News Summarization
Automatically generates concise summaries of Turkish news articles (see the usage sketch at the end of this section).
Achieves a ROUGE-2 F1 score of 29.48.
Content Analysis
News Content Analysis
Extracts key information from large volumes of Turkish news.
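
The following is a minimal usage sketch for generating a summary. The repository ID is a placeholder (the actual Hub ID is not stated on this card), and the generation settings are illustrative assumptions.

```python
# Minimal usage sketch: summarizing a Turkish news article.
# The model ID below is a placeholder, not confirmed by this card.
from transformers import AutoTokenizer, EncoderDecoderModel

model_id = "path/to/bert2bert_shared-turkish-summarization"  # placeholder repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = EncoderDecoderModel.from_pretrained(model_id)

article = "..."  # a Turkish news article goes here

inputs = tokenizer(article, truncation=True, max_length=512, return_tensors="pt")
output_ids = model.generate(
    inputs.input_ids,
    attention_mask=inputs.attention_mask,
    max_length=128,      # illustrative generation settings
    num_beams=4,
    early_stopping=True,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```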