
BigBird-Pegasus Large PubMed

Developed by Google
BigBirdPegasus is a Transformer model based on sparse attention. It can handle longer sequences than standard Transformers and is particularly well suited to long-document summarization.
Downloads 2,031
Release Time : 3/2/2022

Model Overview

BigBirdPegasus is a Transformer model based on sparse attention that extends the capabilities of traditional Transformers, efficiently processing sequences up to 4096 tokens in length. It excels in tasks such as long document summarization.
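A minimal usage sketch, assuming the `transformers` and `torch` packages are installed; the checkpoint name `google/bigbird-pegasus-large-pubmed` is the Hugging Face release of this model, and the generation settings (beam count, summary length) are illustrative choices, not prescribed values:

```python
# Hedged sketch: summarizing a long PubMed-style article with the
# google/bigbird-pegasus-large-pubmed checkpoint.
from transformers import AutoTokenizer, BigBirdPegasusForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/bigbird-pegasus-large-pubmed")
model = BigBirdPegasusForConditionalGeneration.from_pretrained(
    "google/bigbird-pegasus-large-pubmed"
)

article = "..."  # replace with the full text of a long scientific article

# The model accepts inputs up to 4096 tokens; longer text is truncated here.
inputs = tokenizer(article, max_length=4096, truncation=True, return_tensors="pt")

# Beam search with an upper bound on summary length (illustrative settings).
summary_ids = model.generate(**inputs, num_beams=5, max_length=256)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```

Because the full 4096-token context is used, whole paper sections can be passed in without the chunking that shorter-context summarizers require.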

Model Features

Sparse Attention Mechanism
Uses a block sparse attention mechanism that significantly reduces the computational cost of processing long sequences.
Long Sequence Processing Capability
Efficiently processes sequences up to 4096 tokens, making it suitable for long document tasks.
High-Performance Summarization
Achieves excellent ROUGE scores in scientific paper summarization tasks.
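To make the sparse-attention feature concrete, here is a toy NumPy sketch of block-local attention: each block of queries attends only to neighbouring key blocks rather than to all keys, so the work grows linearly in sequence length instead of quadratically. This is a simplification for illustration only; BigBird's actual pattern also adds global and random attention blocks.

```python
import numpy as np

def block_sparse_attention(q, k, v, block_size, window=1):
    """Toy block-local attention: each query block attends only to key
    blocks within `window` of it, instead of scoring all n keys."""
    n, d = q.shape
    assert n % block_size == 0
    num_blocks = n // block_size
    out = np.zeros_like(v)
    for i in range(num_blocks):
        q_slice = slice(i * block_size, (i + 1) * block_size)
        lo = max(0, i - window) * block_size
        hi = min(num_blocks, i + window + 1) * block_size
        # Scaled dot-product attention restricted to the local key window.
        scores = q[q_slice] @ k[lo:hi].T / np.sqrt(d)
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        out[q_slice] = weights @ v[lo:hi]
    return out

rng = np.random.default_rng(0)
n, d = 4096, 64
q, k, v = (rng.standard_normal((n, d)) for _ in range(3))
out = block_sparse_attention(q, k, v, block_size=64)
# Each query block scores at most 3 key blocks, a small fraction of the
# n x n score matrix that full attention would compute.
```

The savings are what make 4096-token inputs practical: full attention over 4096 tokens would materialize roughly 16.8 million pairwise scores, while the windowed pattern above computes only a few hundred thousand.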

Model Capabilities

Long Document Summarization
Scientific Paper Summarization

Use Cases

Academic Research
PubMed Paper Summarization
Generates concise and accurate summaries for PubMed scientific papers
ROUGE-1 score 40.8966, ROUGE-2 score 18.1161
arXiv Paper Summarization
Generates summaries for arXiv scientific papers
ROUGE-1 score 40.3815, ROUGE-2 score 14.374
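The ROUGE scores above measure n-gram overlap between generated and reference summaries. The following is a simplified ROUGE-1 F1 computation over unigram counts; the official scores are produced by the `rouge-score` package, which additionally applies stemming and tokenization rules not reproduced here:

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """Simplified ROUGE-1: F1 over unigram overlap (no stemming,
    unlike the official rouge-score implementation)."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

score = rouge1_f1(
    "the model summarizes long documents",
    "the model summarizes long scientific documents",
)
# All 5 candidate unigrams match; the reference has 6, so
# precision = 1.0, recall = 5/6, F1 = 10/11.
```

ROUGE-2 is computed the same way over bigrams instead of unigrams.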