# Biomedical Pretraining

## BioMistral 7B DARE

- License: Apache-2.0
- Author: BioMistral
- Downloads: 426 · Likes: 20
- Tags: Large Language Model · Transformers · Supports Multiple Languages

BioMistral-7B-mistral7instruct-dare is a medical-domain large language model produced by merging Mistral-7B-Instruct-v0.1 and BioMistral-7B with the DARE method; it specializes in biomedical text generation tasks.
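The DARE method mentioned above merges models by sparsifying each fine-tuned model's weight delta and rescaling what survives. A minimal NumPy sketch of that drop-and-rescale idea follows; the `dare_merge` helper, the drop rate, and the toy matrices are illustrative assumptions, not the actual BioMistral merge recipe.

```python
import numpy as np

def dare_merge(base, finetuned_list, drop_rate=0.9, seed=0):
    """Sketch of DARE (Drop And REscale) model merging.

    For each fine-tuned model, the delta from the base weights is
    randomly dropped with probability `drop_rate`, the surviving
    entries are rescaled by 1 / (1 - drop_rate), and the sparsified
    deltas are added back onto the base weights.
    """
    rng = np.random.default_rng(seed)
    merged = base.copy()
    for ft in finetuned_list:
        delta = ft - base                             # task vector
        mask = rng.random(delta.shape) >= drop_rate   # keep ~(1 - p) of entries
        merged += np.where(mask, delta / (1.0 - drop_rate), 0.0)
    return merged

# Toy example with a single 4x4 "weight matrix" standing in for a model.
base = np.zeros((4, 4))
ft_a = base + 0.1   # pretend fine-tune A
ft_b = base - 0.2   # pretend fine-tune B
merged = dare_merge(base, [ft_a, ft_b], drop_rate=0.5)
print(merged.shape)
```

Because each kept delta entry is rescaled by `1 / (1 - drop_rate)`, the merged weights match `base + sum(deltas)` in expectation even at high drop rates, which is what lets DARE combine several fine-tunes without the deltas interfering much.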
## BiomedVLP CXR-BERT General

- License: MIT
- Author: microsoft
- Downloads: 12.31k · Likes: 37
- Tags: Large Language Model · Transformers · English

CXR-BERT is a specialized language model for the chest X-ray domain, optimized for radiology text processing through an improved vocabulary and pretraining procedure.
## BioM ALBERT xxlarge PMC

- Author: sultan
- Downloads: 189 · Likes: 4
- Tags: Large Language Model · Transformers

A large-scale biomedical language model from a family based on BERT, ALBERT, and ELECTRA, achieving state-of-the-art results on multiple biomedical tasks.
© 2025 AIbase