
LinkBERT-large

Developed by michiyasunaga
LinkBERT-large is an improved BERT model pre-trained on English Wikipedia and book corpora. By incorporating document hyperlink information into pre-training, it achieves stronger cross-document knowledge understanding.
Downloads 2,042
Release Time : 3/8/2022

Model Overview

This model improves on the standard BERT architecture by modeling hyperlink relationships between documents during pre-training. It excels at knowledge-intensive tasks (e.g., question answering) and cross-document tasks, and can serve as a drop-in replacement for BERT.

Model Features

Cross-document link modeling
Innovatively incorporates related documents into pre-training contexts, capturing cross-document knowledge relationships through hyperlinks.
Knowledge-enhanced representations
Trained on structured Wikipedia data, generating text embeddings rich in entity relationships.
BERT ecosystem compatibility
Can directly replace existing BERT models without modifying downstream task architectures.
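Because LinkBERT is a drop-in replacement for BERT, it can be loaded with the standard Hugging Face `transformers` API. A minimal sketch, assuming the checkpoint is published under the hub id `michiyasunaga/LinkBERT-large` (verify the exact id on the model hub):

```python
# Sketch: extract text features with LinkBERT-large via Hugging Face transformers.
# The model id "michiyasunaga/LinkBERT-large" is an assumption; check the hub listing.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("michiyasunaga/LinkBERT-large")
model = AutoModel.from_pretrained("michiyasunaga/LinkBERT-large")

inputs = tokenizer(
    "LinkBERT captures hyperlink relationships between documents.",
    return_tensors="pt",
)
with torch.no_grad():
    outputs = model(**inputs)

# Token-level embeddings; a BERT-large architecture has hidden size 1024.
embeddings = outputs.last_hidden_state
print(embeddings.shape)  # (1, seq_len, 1024)
```

Any code that previously called `AutoModel.from_pretrained("bert-large-uncased")` can simply swap in the LinkBERT id; downstream task heads and tokenization flow stay unchanged.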

Model Capabilities

Text feature extraction
Masked language modeling
Question answering (with task-specific fine-tuning)
Text classification
Sequence labeling
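The masked language modeling capability listed above can be exercised directly with the `fill-mask` pipeline. A hedged sketch, again assuming the hub id `michiyasunaga/LinkBERT-large` (LinkBERT inherits BERT's `[MASK]` token):

```python
# Sketch: masked language modeling with LinkBERT-large.
# The model id is an assumption; LinkBERT uses BERT's [MASK] token.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="michiyasunaga/LinkBERT-large")
predictions = unmasker("The capital of France is [MASK].")

for p in predictions:  # the pipeline returns the top-5 candidates by default
    print(p["token_str"], round(p["score"], 3))
```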

Use Cases

Knowledge-intensive tasks
Open-domain question answering
Handles complex questions requiring cross-document knowledge.
Achieves an F1 score of 80.8 on HotpotQA, surpassing BERT-large's 78.1.
Information retrieval
Document association analysis
Enhances document similarity calculations using link information.
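Document similarity of the kind described above can be sketched by mean-pooling LinkBERT embeddings and comparing them with cosine similarity. This is an illustrative approach, not the model's published evaluation protocol, and the model id is assumed:

```python
# Sketch: document similarity via mean-pooled LinkBERT embeddings.
# The model id and the mean-pooling choice are assumptions for illustration.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "michiyasunaga/LinkBERT-large"
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

def embed(text: str) -> torch.Tensor:
    """Mean-pool the final hidden states into a single document vector."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, 1024)
    return hidden.mean(dim=1).squeeze(0)

a = embed("The Eiffel Tower is a landmark in Paris.")
b = embed("Paris is home to the Eiffel Tower.")
similarity = F.cosine_similarity(a, b, dim=0).item()
print(similarity)  # in [-1, 1]; higher means more similar
```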