
Meditron-70B

Developed by epfl-llm
Meditron-70B is an open-source large language model adapted from Llama-2-70B through continued pretraining on medical-domain corpora, with a focus on medical knowledge encoding and reasoning tasks.
Downloads: 214
Release Time: 11/8/2023

Model Overview

A 70-billion-parameter large language model for the medical domain, trained on a carefully curated medical corpus and outperforming comparable models on multiple medical reasoning benchmarks.

Model Features

Medical Domain Adaptation
Continued pretraining on a 48.1B-token medical corpus (clinical guidelines and medical papers)
High Performance
Achieves 71.2% accuracy on the TruthfulQA medical category, surpassing Llama-2-70B (54.8%) and Med42-70B (58.0%)
Long Context Support
4K-token context window, suitable for processing complex medical documents

Model Capabilities

Medical Q&A generation
Clinical guideline parsing
Medical literature comprehension
Diagnostic assistance reasoning
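
These capabilities can be exercised with standard Hugging Face tooling. The snippet below is a minimal sketch, assuming the checkpoint is available on the Hugging Face Hub as epfl-llm/meditron-70b, that the accelerate package is installed so device_map="auto" can shard the 70B weights across GPUs, and that enough GPU memory is available; the prompt format is illustrative, not an official template.

```python
# Minimal sketch: loading Meditron-70B for medical Q&A with Hugging Face transformers.
# Assumptions: checkpoint hosted as "epfl-llm/meditron-70b"; accelerate installed;
# enough GPU memory for a 70B model in bfloat16.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "epfl-llm/meditron-70b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to reduce the memory footprint
    device_map="auto",           # shard layers across available GPUs
)

# Illustrative prompt only; Meditron-70B is a base (non-instruction-tuned) model,
# so it continues the text rather than following chat-style instructions.
prompt = "Question: What are the first-line treatments for type 2 diabetes?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=False,  # greedy decoding for a deterministic answer
)
# Print only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```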

Use Cases

Clinical Decision Support
Medical Exam Q&A
Answering questions from medical licensing exams such as the USMLE
Reaches 64.4% accuracy on the MedQA test set (see the evaluation sketch at the end of this section)
Assisted Differential Diagnosis
Generating possible diagnostic suggestions based on symptom descriptions
Medical Information Retrieval
Disease Information Retrieval
Querying medical knowledge such as symptoms, causes, and treatment plans
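
For exam-style use cases such as MedQA, one common evaluation recipe is to score each multiple-choice option by the log-likelihood the model assigns to it and pick the highest-scoring option. The sketch below illustrates that idea; the question, options, and prompt format are hypothetical, the prompt/answer token boundary is handled approximately, and this is not necessarily the exact protocol behind the reported 64.4% figure.

```python
# Minimal sketch: scoring a USMLE-style multiple-choice question by comparing the
# log-likelihood Meditron-70B assigns to each answer option (hypothetical example).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "epfl-llm/meditron-70b"  # assumed Hub identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

question = ("A 55-year-old man presents with crushing chest pain radiating to the "
            "left arm. Which initial test is most appropriate?")
options = {"A": "Electrocardiogram", "B": "Chest X-ray",
           "C": "Abdominal ultrasound", "D": "Spirometry"}

def option_logprob(question: str, answer: str) -> float:
    """Sum of log-probabilities of the answer tokens given the question prompt."""
    prompt = f"Question: {question}\nAnswer: "
    prompt_len = tokenizer(prompt, return_tensors="pt").input_ids.shape[1]
    full_ids = tokenizer(prompt + answer, return_tensors="pt").input_ids.to(model.device)
    with torch.no_grad():
        logits = model(full_ids).logits
    # Position i of the logits predicts token i+1; approximate the answer span
    # as everything after the prompt's token count.
    log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
    answer_positions = range(prompt_len - 1, full_ids.shape[1] - 1)
    return sum(log_probs[pos, full_ids[0, pos + 1]].item() for pos in answer_positions)

scores = {letter: option_logprob(question, text) for letter, text in options.items()}
print(max(scores, key=scores.get))  # predicted option letter
```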