
BioM-ALBERT-xxlarge

Developed by: sultan
A large-scale biomedical language model based on BERT, ALBERT, and ELECTRA designs, specialized for biomedical domain tasks
Downloads: 77
Release Time: 3/2/2022

Model Overview

BioM-Transformers are large Transformer models pretrained for the biomedical domain through several different design choices, achieving state-of-the-art performance on multiple biomedical tasks
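As a quick orientation, the sketch below shows one way the checkpoint could be loaded for masked-token prediction with the Hugging Face transformers library. The model ID sultan/BioM-ALBERT-xxlarge and the example sentence are assumptions, not details taken from this page.

```python
# Minimal sketch: loading the checkpoint for masked-token prediction.
# The model ID "sultan/BioM-ALBERT-xxlarge" and the example sentence are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "sultan/BioM-ALBERT-xxlarge"  # assumed Hugging Face repository name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Predict a masked biomedical term.
text = f"Aspirin irreversibly inhibits the {tokenizer.mask_token} enzyme."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Top-5 candidate tokens at the masked position.
mask_positions = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_positions[0]].topk(5).indices
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```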

Model Features

Biomedical Domain Specialization
Pretrained specifically on PubMed abstracts using a domain-specific vocabulary for biomedicine
Multi-architecture Support
Provides different architecture variants based on BERT, ALBERT, and ELECTRA
Efficient Training
Trained efficiently on a TPUv3-512 pod with a large batch size of 8192
Resource-friendly
Offers a PyTorch XLA implementation, supporting execution on free TPU resources (see the sketch after this list)
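For the resource-friendly point above, a minimal sketch of placing the model on a TPU through PyTorch XLA, assuming torch_xla is installed, a TPU runtime is available, and the same assumed model ID:

```python
# Hedged sketch: placing the model on a TPU device via PyTorch XLA.
# Assumes torch_xla is installed and a TPU runtime is available.
import torch_xla.core.xla_model as xm
from transformers import AutoModelForMaskedLM

device = xm.xla_device()  # acquire the XLA (TPU) device
model = AutoModelForMaskedLM.from_pretrained("sultan/BioM-ALBERT-xxlarge")  # assumed model ID
model = model.to(device)  # subsequent forward passes run on the TPU
```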

Model Capabilities

Biomedical text classification
Named entity recognition (see the fine-tuning sketch after this list)
Question answering
Fact-based question answering
Biomedical relation extraction
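To illustrate how a capability such as named entity recognition would typically be wired up, here is a hedged sketch that attaches a token-classification head to the pretrained encoder. The label set and model ID are illustrative assumptions, and the new head must still be fine-tuned on a labeled biomedical corpus before use.

```python
# Hedged sketch: attaching a token-classification head for biomedical NER.
# The BIO label set below is illustrative; the head is randomly initialized
# and must be fine-tuned on a labeled biomedical NER corpus before use.
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_id = "sultan/BioM-ALBERT-xxlarge"  # assumed Hugging Face repository name
labels = ["O", "B-ENTITY", "I-ENTITY"]   # hypothetical tag set

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(
    model_id,
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)
```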

Use Cases

Biomedical Research
Chemical-Protein Relation Identification
Classifying chemical-protein relations on the ChemProt dataset
Achieved state-of-the-art performance
Biomedical Named Entity Recognition
Identifying specialized terms and entities in biomedical texts
Medical QA Systems (see the sketch below)
SQuAD2.0 QA
Answering questions over the standard SQuAD2.0 dataset
BioASQ7B Fact-based QA
Answering fact-based questions in the biomedical domain
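For the QA use cases above, a hedged sketch of preparing the encoder for SQuAD-style extractive question answering. The model ID, question, and context are assumptions, and the QA head is randomly initialized until fine-tuned on SQuAD2.0 or BioASQ-style data.

```python
# Hedged sketch: preparing the encoder for SQuAD-style extractive QA.
# The question/context pair is illustrative; start/end logits are only
# meaningful after fine-tuning on SQuAD2.0 or BioASQ-style data.
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

model_id = "sultan/BioM-ALBERT-xxlarge"  # assumed Hugging Face repository name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)  # QA head untrained

question = "Which enzyme does aspirin inhibit?"
context = "Aspirin irreversibly inhibits cyclooxygenase (COX) enzymes."
inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
start = outputs.start_logits.argmax(dim=-1)  # predicted answer start index
end = outputs.end_logits.argmax(dim=-1)      # predicted answer end index
```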