BioLinkBERT Large

Developed by michiyasunaga
BioLinkBERT is a biomedical language model pre-trained on PubMed abstracts and citation links, enhancing performance through cross-document knowledge integration.
Downloads 3,152
Release Time: 3/8/2022

Model Overview

An improved BERT model that captures cross-document relationships using document links (e.g., citations), optimized for biomedical NLP tasks and achieving SOTA performance on multiple benchmarks.

Model Features

Cross-document Knowledge Integration
Enhances contextual understanding by jointly processing related documents through citation links.
Biomedical Domain Optimization
Pre-trained on PubMed data, specifically designed for biomedical text processing.
Multi-task Adaptability
Supports fine-tuning for various downstream tasks like QA and classification, or direct use for feature extraction.
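As a sketch of the feature-extraction use mentioned above, the checkpoint can be loaded with Hugging Face Transformers; the Hub id `michiyasunaga/BioLinkBERT-large` is assumed from the developer name on this card.

```python
# Minimal sketch: extracting token features from BioLinkBERT-large.
# Assumes the Hub id michiyasunaga/BioLinkBERT-large (not stated verbatim on this card).
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "michiyasunaga/BioLinkBERT-large"  # assumed model identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

text = "Aspirin irreversibly inhibits cyclooxygenase enzymes."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One hidden-state vector per token; BERT-large uses a 1024-dim hidden size.
features = outputs.last_hidden_state
print(features.shape)
```

The same checkpoint can instead be passed to `AutoModelForSequenceClassification` or `AutoModelForQuestionAnswering` when fine-tuning for the classification and QA tasks listed above.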

Model Capabilities

Biomedical Text Understanding
Cross-document Relationship Analysis
Question Answering System Construction
Text Classification
Sequence Labeling
Feature Vector Extraction

Use Cases

Medical Research
Drug Mechanism Analysis
Analyzes text describing drug targets and mechanisms of action.
Achieves 72.2% accuracy on PubMedQA tasks.
Clinical Decision Support
Medical Exam QA
Answers USMLE medical licensing exam questions.
44.6% accuracy on MedQA-USMLE, surpassing same-scale models.