# Medical Education
## MedAlpaca 13b
License: CC
MedAlpaca 13b is a large language model fine-tuned for medical-domain tasks. Built on the LLaMA architecture with 13 billion parameters, it is designed to improve performance on medical question answering and dialogue.
Tags: Large Language Model · Transformers · English

Maintainer: medalpaca
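Since the card tags the model as a Transformers-compatible instruction model for medical Q&A, a minimal usage sketch may help. The "Question/Answer" prompt template below and the Hugging Face model ID `medalpaca/medalpaca-13b` are assumptions inferred from the card, not verified specifics:

```python
# Sketch of querying an instruction-tuned medical Q&A model such as
# MedAlpaca. The prompt template and model ID are assumptions based on
# the model card above, not confirmed details.

def build_prompt(question: str, context: str = "") -> str:
    """Assemble a simple instruction-style prompt for a medical Q&A model."""
    parts = []
    if context:
        parts.append(f"Context: {context}")
    parts.append(f"Question: {question}")
    parts.append("Answer:")
    return "\n".join(parts)

prompt = build_prompt("What are common symptoms of type 2 diabetes?")
print(prompt)

# With the transformers library installed, the model could then be run
# along these lines (hypothetical invocation, heavy download required):
#
#   from transformers import pipeline
#   qa = pipeline("text-generation", model="medalpaca/medalpaca-13b")
#   print(qa(prompt, max_new_tokens=128)[0]["generated_text"])
```

The actual inference call is left as a comment because it downloads a 13B-parameter checkpoint; only the prompt-assembly helper runs as written.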
## MedAlpaca 7b
License: CC
MedAlpaca 7b is a 7-billion-parameter language model for the medical domain, fine-tuned from the LLaMA architecture and well suited to medical question answering and dialogue.
Tags: Large Language Model · Transformers · English

Maintainer: medalpaca