
BioMistral 7B DARE AWQ QGS128 W4 GEMM

Developed by BioMistral
An open-source large language model for the medical domain based on the Mistral architecture, further pre-trained on PubMed Central texts
Downloads 135
Release Time: 2/17/2024

Model Overview

BioMistral is a suite of open-source large language models optimized for the healthcare domain. Built on the Mistral-7B architecture and further pre-trained on open-access texts from PubMed Central, it demonstrates strong performance on medical Q&A tasks.

Model Features

Medical Domain Specialization
Enhanced biomedical knowledge understanding through further pre-training on high-quality medical literature from PubMed Central
Multi-model Fusion Strategy
Offers three model-merging variants (DARE, TIES, SLERP) that combine the medical model with the base Mistral weights to improve performance
Efficient Quantized Versions
Available in several quantized versions (AWQ and BitsAndBytes); this AWQ 4-bit build runs in as little as 4.68GB of VRAM
Multilingual Evaluation Benchmark
First large-scale multilingual (8 languages) LLM evaluation in the medical domain
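As a rough sanity check on the 4.68GB figure above, the 4-bit weight footprint can be estimated from the parameter count. The ~7.24B figure below is an assumption based on Mistral-7B; this card does not state the exact count.

```python
# Back-of-the-envelope VRAM estimate for a 4-bit (W4) quantized 7B model.
# PARAMS is an assumption based on Mistral-7B, not stated in this card.
PARAMS = 7.24e9      # approximate parameter count of Mistral-7B
BITS_PER_WEIGHT = 4  # W4 quantization

weight_bytes = PARAMS * BITS_PER_WEIGHT / 8
weight_gb = weight_bytes / 1e9

print(f"4-bit weights alone: ~{weight_gb:.2f} GB")
# The quoted 4.68GB is plausibly this plus quantization scales/zero-points
# (one set per 128-weight group at QGS128), the KV cache, and runtime overhead.
```

The gap between ~3.6GB of raw weights and the quoted 4.68GB is consistent with the per-group quantization metadata implied by QGS128 plus normal inference overhead.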

Model Capabilities

Medical Q&A
Biomedical Text Understanding
Medical Knowledge Reasoning
Multilingual Medical Text Processing

Use Cases

Medical Education
Medical Exam Question Answering
Answering questions from medical licensing exams such as the USMLE
Achieves higher accuracy than most open-source medical models on benchmarks like MedQA
Clinical Research Support
Medical Literature Analysis
Assisting researchers in quickly understanding medical literature content
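For Q&A or literature-analysis workflows, the AWQ checkpoint can be loaded with Hugging Face transformers roughly as follows. This is a sketch only: the Hub repo id, the CUDA GPU requirement, and the generation settings are assumptions not stated in this card.

```python
MODEL_ID = "BioMistral/BioMistral-7B-DARE-AWQ-QGS128-W4-GEMM"  # assumed Hub repo id

def generate_answer(question: str, max_new_tokens: int = 256) -> str:
    """Load the 4-bit AWQ model (a CUDA GPU with ~5 GB VRAM is assumed) and answer."""
    # Imports are local so the sketch can be read without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    prompt = f"[INST] {question} [/INST]"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

Loading AWQ weights through `AutoModelForCausalLM.from_pretrained` relies on transformers' built-in AWQ support, which requires a compatible GPU runtime.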