
BioMistral 7B

Developed by BioMistral
BioMistral is an open-source large language model optimized for the medical domain. Built on the Mistral architecture and further pre-trained on PubMed Central open-access text, it supports multilingual medical question-answering tasks.
Downloads: 22.59k
Release Date: 2/14/2024

Model Overview

A collection of open-source pre-trained large language models for the biomedical domain that performs strongly on multiple medical QA benchmarks and supports eight languages, including English.

Model Features

Medical Domain Optimization
Further pre-trained on PubMed Central open-access text data, specifically optimized for biomedical domain knowledge.
Multilingual Support
Supports medical QA tasks in 8 languages, including major European languages.
Model Merging Variants
Provides three merged model variants (DARE/TIES/SLERP) intended to further improve performance.
Quantized Versions
Offers multiple quantized versions (AWQ/BnB), significantly reducing VRAM requirements while maintaining good performance.
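As a quick orientation, the sketch below shows one way to load the model with Hugging Face Transformers and 4-bit bitsandbytes quantization to reduce VRAM use. The repository id "BioMistral/BioMistral-7B" and the specific settings are assumptions for illustration, not official instructions; adjust them to the variant (AWQ, BnB, or a merged DARE/TIES/SLERP checkpoint) you actually download.

```python
# Minimal sketch: loading BioMistral 7B with 4-bit quantization via bitsandbytes.
# The repo id "BioMistral/BioMistral-7B" is an assumption; substitute the exact
# quantized or merged variant you intend to use.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "BioMistral/BioMistral-7B"  # assumed Hugging Face repo id

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # 4-bit weights cut VRAM roughly 4x vs. fp16
    bnb_4bit_compute_dtype=torch.bfloat16,  # compute in bf16 for speed and stability
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # place layers across available GPUs/CPU automatically
)
```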

Model Capabilities

Medical Text Generation
Medical Question Answering (see the sketch after this list)
Multilingual Medical Information Processing
Biomedical Knowledge Retrieval
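To illustrate the medical question-answering capability, here is a self-contained, hedged example using the Transformers text-generation pipeline. The repo id, prompt wording, and decoding settings are assumptions for demonstration only, and model answers should not be treated as professional medical advice.

```python
# Minimal sketch of a medical QA prompt using the transformers pipeline.
# The repo id "BioMistral/BioMistral-7B" and the prompt are illustrative assumptions.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="BioMistral/BioMistral-7B",  # assumed Hugging Face repo id
    device_map="auto",
)

prompt = "Question: What is the primary function of the mitral valve?\nAnswer:"
result = generator(prompt, max_new_tokens=128, do_sample=False)  # greedy decoding
print(result[0]["generated_text"])
```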

Use Cases

Medical Education
Medical Knowledge QA
Answering professional questions in clinical medicine, anatomy, and other specialized fields.
Achieved an average score of 57.3 on 10 English medical QA benchmarks, outperforming the base Mistral model.
Medical Research
Literature Information Extraction
Extracting key information from medical literature.