LinkBERT Base

Developed by michiyasunaga
LinkBERT is an improved BERT model pre-trained on English Wikipedia and hyperlink information, enhancing performance on knowledge-intensive tasks by capturing cross-document associations.
Downloads: 195
Release date: March 8, 2022

Model Overview

LinkBERT is a Transformer encoder-based model that strengthens cross-document knowledge associations by incorporating document links (e.g., hyperlinks), suitable for tasks like question answering and text classification.

Model Features

Cross-document link pre-training
Incorporates related documents into the same context using hyperlinks to enhance knowledge associations
BERT architecture compatible
Can directly replace BERT without modifying downstream task code
Optimized for knowledge-intensive tasks
Outperforms the original BERT on tasks such as question answering and reading comprehension
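Because LinkBERT keeps the BERT architecture, swapping it into existing BERT code is a one-line change of the checkpoint name. A minimal sketch using the Hugging Face `transformers` library with the `michiyasunaga/LinkBERT-base` checkpoint from the Hugging Face Hub (assumes `transformers` and `torch` are installed):

```python
from transformers import AutoTokenizer, AutoModel

# Identical loading code to BERT; only the checkpoint name changes.
# BERT equivalent: AutoModel.from_pretrained("bert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("michiyasunaga/LinkBERT-base")
model = AutoModel.from_pretrained("michiyasunaga/LinkBERT-base")

# Encode a sentence and extract contextual token embeddings.
inputs = tokenizer("LinkBERT captures cross-document associations.",
                   return_tensors="pt")
outputs = model(**inputs)

# BERT-base-sized hidden states: (batch, seq_len, 768)
print(outputs.last_hidden_state.shape)
```

Any downstream head built on BERT-base hidden states (classification, QA span extraction, tagging) can consume these embeddings unchanged.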

Model Capabilities

Text feature extraction
Masked language modeling
Building question answering systems
Text classification
Sequence labeling
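The masked language modeling capability can be exercised directly through the `AutoModelForMaskedLM` class. A hedged sketch (whether the released checkpoint ships a pretrained MLM head is an assumption; if it does not, `transformers` will initialize one randomly and print a warning):

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("michiyasunaga/LinkBERT-base")
model = AutoModelForMaskedLM.from_pretrained("michiyasunaga/LinkBERT-base")

# Build a cloze-style input containing the tokenizer's mask token.
text = f"Paris is the capital of {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (batch, seq_len, vocab_size)

# Decode the top prediction at the [MASK] position.
mask_idx = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
top_token = logits[0, mask_idx].argmax().item()
print(tokenizer.decode([top_token]))
```

The same loading pattern with `AutoModelForQuestionAnswering` or `AutoModelForSequenceClassification` covers the QA and classification capabilities listed above, with task heads fine-tuned on labeled data.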

Use Cases

Knowledge-intensive tasks
Open-domain question answering
Leverages cross-document associations to answer complex questions
2.2 points higher than BERT-base on HotpotQA
Document retrieval
Document relevance ranking based on link relationships
General NLP tasks
Text classification
Sentiment analysis, topic classification, etc.
0.4 points higher than BERT-base on GLUE average score