
GEBERT (gebert_eng_gat)

Developed by andorei
GEBERT is a biomedical entity linking model built on PubMedBERT pre-training and enhanced with UMLS concept-graph representation learning through a GAT graph encoder.
Downloads: 41
Release Time: 9/15/2023

Model Overview

This model combines the BERT architecture with a Graph Attention Network (GAT), is optimized specifically for biomedical entity linking, and leverages the UMLS knowledge graph to improve the quality of concept representations.
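For orientation, the sketch below loads the released checkpoint as an ordinary Hugging Face encoder and embeds a single mention. The model identifier andorei/gebert_eng_gat and the use of the [CLS] vector as the mention embedding are assumptions based on this card, not a documented API.

```python
# Hypothetical usage sketch: embed a biomedical mention with the released encoder.
# The model ID below is assumed; verify it against the actual repository.
import torch
from transformers import AutoTokenizer, AutoModel

MODEL_ID = "andorei/gebert_eng_gat"  # assumed Hugging Face identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

mention = "myocardial infarction"
inputs = tokenizer(mention, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Take the [CLS] token representation as the mention embedding
# (mean pooling over tokens is a common alternative).
mention_embedding = outputs.last_hidden_state[:, 0, :]
print(mention_embedding.shape)  # e.g. torch.Size([1, 768])
```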

Model Features

Graph Structure Enhancement
Integrates the topology of the UMLS concept graph through a GAT encoder to enrich the contextual representations of biomedical concepts
Domain-Adaptive Pre-training
Continues pre-training from PubMedBERT, further adapting the encoder to the characteristics of biomedical text
Multimodal Knowledge Fusion
Uses concept-name text and knowledge-graph structure together for joint training (a minimal sketch of this coupling follows this list)
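As a rough illustration of how a GAT encoder can sit on top of a UMLS-style concept graph and be aligned with the text encoder's concept-name embeddings, here is a minimal PyTorch Geometric sketch. The toy graph, the dimensions, and the simple cosine alignment loss are stand-ins for illustration, not GEBERT's actual pre-training objective.

```python
# Illustrative sketch (not the authors' training code): a GAT encoder over a
# small UMLS-style concept graph, aligned with text-encoder embeddings of the
# concept names via a cosine-similarity loss.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GATConv


class ConceptGraphEncoder(torch.nn.Module):
    """Two-layer GAT that refines initial concept-name embeddings
    using the topology of the concept graph."""

    def __init__(self, dim: int = 768, heads: int = 4):
        super().__init__()
        self.gat1 = GATConv(dim, dim // heads, heads=heads)
        self.gat2 = GATConv(dim, dim, heads=1)

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        x = F.elu(self.gat1(x, edge_index))
        return self.gat2(x, edge_index)


# Toy graph: 4 concepts, edges given as (source, target) index pairs.
edge_index = torch.tensor([[0, 1, 1, 2, 3], [1, 0, 2, 1, 1]], dtype=torch.long)

# Initial node features, e.g. [CLS] embeddings of the concept names
# produced by the text encoder (random here for brevity).
text_embeddings = torch.randn(4, 768)

graph_encoder = ConceptGraphEncoder()
graph_embeddings = graph_encoder(text_embeddings, edge_index)

# Joint training signal: pull each concept's graph view and text view together.
alignment_loss = 1.0 - F.cosine_similarity(graph_embeddings, text_embeddings, dim=-1).mean()
alignment_loss.backward()
```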

Model Capabilities

Biomedical Entity Recognition
Concept Normalization
Knowledge Graph Embedding
Cross-Document Entity Linking

Use Cases

Medical Literature Processing
Electronic Medical Record Standardization
Links non-standard terms in clinical records to standard UMLS concepts, improving record structuring and retrieval (see the linking sketch after this list)
Biomedical Literature Mining
Extracts normalized disease and drug entities from research papers, supporting large-scale analysis of biomedical relationships
Knowledge Graph Construction
Graph Entity Alignment
Resolves entity heterogeneity issues across different biomedical knowledge sources
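To make the record-standardization use case concrete, the sketch below links a non-standard clinical term to the closest entry of a tiny concept dictionary by cosine similarity over encoder embeddings. The model identifier, the three example concepts, and the [CLS] pooling are illustrative assumptions; a production system would embed and index the full UMLS vocabulary (for example with FAISS).

```python
# Hypothetical linking sketch: map a non-standard clinical term to the closest
# concept in a tiny, illustrative UMLS-style dictionary.
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

MODEL_ID = "andorei/gebert_eng_gat"  # assumed identifier
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID).eval()

# Toy concept dictionary of (CUI, preferred name) pairs; chosen for illustration only.
concepts = [
    ("C0027051", "myocardial infarction"),
    ("C0020538", "hypertensive disease"),
    ("C0011849", "diabetes mellitus"),
]


def embed(texts):
    """Return [CLS] embeddings for a list of strings."""
    batch = tokenizer(texts, padding=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**batch)
    return out.last_hidden_state[:, 0, :]


concept_vecs = F.normalize(embed([name for _, name in concepts]), dim=-1)

mention = "heart attack"  # non-standard term from a clinical note
mention_vec = F.normalize(embed([mention]), dim=-1)

# Cosine similarity between the mention and every candidate concept.
scores = mention_vec @ concept_vecs.T
best = scores.argmax(dim=-1).item()
print(mention, "->", concepts[best])  # expected: ('C0027051', 'myocardial infarction')
```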