
BioM-ALBERT-xxlarge-PMC

Developed by: sultan
Large-scale biomedical language models based on BERT, ALBERT, and ELECTRA, achieving state-of-the-art results on multiple biomedical tasks.
Downloads: 189
Released: 3/2/2022

Model Overview

BioM-Transformers is a series of Transformer models optimized for the biomedical domain, demonstrating exceptional performance in biomedical text processing tasks through various architectural choices. The models are pretrained on PMC full-text data and support multiple biomedical NLP tasks.
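As a masked language model, the checkpoint can be queried directly with the Hugging Face `transformers` library. A minimal sketch, assuming the model is published on the Hub under the id `sultan/BioM-ALBERT-xxlarge-PMC` (verify the exact id on the model page; the prompt text is illustrative):

```python
# Sketch: querying BioM-ALBERT-xxlarge-PMC as a masked language model.
# MODEL_ID is an assumption based on this card; check the Hub listing.
MODEL_ID = "sultan/BioM-ALBERT-xxlarge-PMC"

def build_prompt(entity: str) -> str:
    """Wrap a biomedical entity in a cloze-style prompt.

    [MASK] is the ALBERT tokenizer's mask token.
    """
    return f"{entity} is a [MASK] hormone secreted by the pancreas."

def top_predictions(text: str, k: int = 3):
    """Download the checkpoint (xxlarge, several GB) and return
    the top-k fillers for the masked position with their scores."""
    from transformers import pipeline  # lazy import: heavy dependency

    fill = pipeline("fill-mask", model=MODEL_ID)
    return [(p["token_str"], p["score"]) for p in fill(text)[:k]]
```

Calling `top_predictions(build_prompt("Insulin"))` fetches the checkpoint and returns candidate fillers with scores; for downstream tasks (classification, NER, QA) the same id can be passed to the corresponding `AutoModelFor...` classes.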

Model Features

Multi-architecture Support
Provides variants based on the BERT, ALBERT, and ELECTRA architectures to suit different application scenarios
Efficient TPU Support
Includes PyTorch XLA and JAX/Flax implementations, allowing fine-tuning using free TPU resources from Google Colab and Kaggle
Biomedical Domain Optimization
An additional 64k pretraining steps on PMC full-text data, specifically adapting the model to the characteristics of biomedical text
Computational Efficiency
Matches or exceeds the performance of comparable models at equal or lower computational cost

Model Capabilities

Biomedical text classification
Biomedical named entity recognition
Biomedical question answering systems
Biomedical relation extraction

Use Cases

Biomedical Literature Processing
ChemProt Relation Classification
Chemical-protein interaction classification task
Micro-average F1: 80.74 (5 epochs of fine-tuning in 43 minutes)
BioASQ Biomedical QA
Answering fact-based questions in the biomedical domain
Clinical Text Analysis
Clinical Named Entity Recognition
Identifying medical entities in clinical texts
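The ChemProt result above is reported as a micro-average F1. A small self-contained sketch of what that metric pools, using toy relation labels (the label names and the convention of excluding the negative class are illustrative, not taken from the ChemProt specification):

```python
def micro_f1(gold, pred, negative="NONE"):
    """Micro-average F1 over relation labels.

    Pools true positives, false positives, and false negatives across
    all classes, treating `negative` as "no relation" (one common
    convention for relation-extraction scoring).
    """
    tp = fp = fn = 0
    for g, p in zip(gold, pred):
        if p != negative and p == g:
            tp += 1          # correctly predicted relation
        if p != negative and p != g:
            fp += 1          # spurious or wrong relation
        if g != negative and p != g:
            fn += 1          # missed or wrong relation
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0

# Toy data: 4 gold relations; 3 recovered, 1 missed, 1 spurious.
gold = ["CPR:3", "CPR:4", "CPR:9", "NONE", "CPR:4"]
pred = ["CPR:3", "CPR:4", "NONE", "CPR:9", "CPR:4"]
```

Here precision and recall are both 3/4, giving a micro-F1 of 0.75; because counts are pooled before averaging, frequent relation classes dominate the score.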