
MedGemma-27B-Text-IT GGUF

Developed by Mungert
MedGemma-27B-Text-IT is a medicine-focused large language model built on the Gemma 3 architecture. It is optimized for medical text processing and is distributed in multiple quantization formats to suit different hardware environments.
Downloads 1,464
Release Time: 5/29/2025

Model Overview

This model focuses on medical text understanding and generation, performs strongly on text-only benchmarks of medical knowledge and reasoning, and is available in multiple quantization formats to meet different hardware requirements.

Model Features

Ultra-low bit quantization
Supports precision-adaptive quantization down to 1-2 bits, limiting accuracy loss while keeping memory usage low.
Optimized for the medical field
Specifically tuned on medical text and achieves strong results on medical benchmarks.
Support for multiple formats
Ships as BF16, F16, and several quantized variants, so you can pick the one that matches your hardware (see the loading sketch after this list).
Support for long contexts
Handles context lengths of at least 128K tokens, making it suitable for long, complex medical documents.
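The sketch below shows one way to load a quantized variant with llama-cpp-python, which is a common runtime for GGUF files. The file name and the q4_k_m quantization suffix are assumptions for illustration; substitute whichever BF16, F16, or lower-bit file you actually downloaded, and size n_ctx to your available memory.

```python
# Minimal sketch: loading a GGUF quantization of MedGemma-27B-Text-IT
# with llama-cpp-python. File name and quant level are assumed, not official.
from llama_cpp import Llama

llm = Llama(
    model_path="medgemma-27b-text-it-q4_k_m.gguf",  # assumed local file name
    n_ctx=8192,       # raise toward the 128K limit if your RAM/VRAM allows
    n_gpu_layers=-1,  # offload all layers to GPU; use 0 for CPU-only inference
)

output = llm(
    "Summarize the first-line treatment options for community-acquired pneumonia.",
    max_tokens=256,
    temperature=0.2,
)
print(output["choices"][0]["text"])
```

Lower-bit quantizations trade some accuracy for a smaller memory footprint, so start with the largest variant your hardware can hold.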

Model Capabilities

Medical text generation
Medical Q&A
Medical report analysis
Medical knowledge reasoning

Use Cases

Clinical assistance
Differentiation of pneumonia types: helps doctors distinguish bacterial from viral pneumonia; the model reaches 89.8% accuracy on the MedQA benchmark.
Medical Q&A: answers medical questions from patients or clinicians, reaching 76.8% accuracy on PubMedQA (a minimal query sketch follows this list).
Medical research
Literature analysis: quickly digest and summarize the content of medical literature.
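As a rough illustration of the medical Q&A use case, the sketch below sends a chat-style query through llama-cpp-python's chat API. The model file name, quantization, and prompts are assumptions, not part of the original model card, and the output should always be reviewed by a qualified clinician.

```python
# Minimal medical Q&A sketch via llama-cpp-python's chat API,
# assuming the same (hypothetical) GGUF file as in the loading example.
from llama_cpp import Llama

llm = Llama(model_path="medgemma-27b-text-it-q4_k_m.gguf", n_ctx=8192)

response = llm.create_chat_completion(
    messages=[
        {"role": "system",
         "content": "You are a careful clinical assistant; note uncertainty where it exists."},
        {"role": "user",
         "content": "Which history, exam, and lab findings help distinguish bacterial from viral pneumonia?"},
    ],
    max_tokens=300,
    temperature=0.2,
)
print(response["choices"][0]["message"]["content"])
```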