
RoBERTa2RoBERTa L-24 BBC

Developed by Google
An encoder-decoder model based on the RoBERTa architecture, designed for extreme summarization and fine-tuned on the BBC XSum dataset.
Downloads 959
Release date: 3/2/2022

Model Overview

This model uses an encoder-decoder architecture in which both the encoder and the decoder are initialized from roberta-large checkpoints, making it well suited to generating extremely concise text summaries.
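The checkpoint can be loaded with the Hugging Face transformers library. Below is a minimal sketch, assuming the model is published on the Hub as google/roberta2roberta_L-24_bbc; the example article text is invented for illustration, and the first call downloads the model weights.

```python
# Minimal sketch of XSum-style single-sentence summarization, assuming the
# checkpoint id "google/roberta2roberta_L-24_bbc" (requires the
# `transformers` and `torch` packages; downloads the weights on first use).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

checkpoint = "google/roberta2roberta_L-24_bbc"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Hypothetical news snippet used only to illustrate the API.
article = (
    "The city council approved funding for a new tram line on Tuesday, "
    "a project expected to take five years and cost 200 million pounds."
)

# Encode the article, truncating to the encoder's maximum input length.
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=512)

# Generate an extremely short, headline-like summary.
summary_ids = model.generate(inputs.input_ids, max_length=32, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

Because the model was fine-tuned on XSum, which pairs each BBC article with a one-sentence summary, generated outputs tend to be a single short sentence rather than a multi-sentence abstract.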

Model Features

Extreme summarization
Optimized for generating extremely short summaries that capture a text's core information.
RoBERTa architecture
Both encoder and decoder are initialized from the strong RoBERTa-large model, providing excellent language understanding.
BBC XSum fine-tuning
Fine-tuned on the BBC XSum news summarization dataset, making it well suited to news text.

Model Capabilities

Text summarization generation
Extreme text compression
Key information extraction

Use Cases

News processing
News headline generation: compress long news reports into extremely short summaries or headlines, producing concise and accurate news highlights.
Content summarization
Document summarization: generate extremely concise summaries of long documents, retaining core information in highly compressed form.