
Gemma 3 4B IT Q4_K_M GGUF

Developed by Aldaris
A GGUF-quantized version of the Gemma 3 4B instruction-tuned model, suitable for local inference.
Downloads: 190
Released: 3/14/2025

Model Overview

This is a GGUF-format conversion of the google/gemma-3-4b-it model, intended mainly for text generation tasks.

Model Features

GGUF Format
A quantized model format designed for efficient execution on local devices.
Local Inference
Runs locally via llama.cpp; no cloud service is required.
Quantized Version
Q4_K_M quantization balances model size against inference quality.
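The features above can be sketched as a single llama.cpp invocation. This is a minimal example, assuming the quantized file is named gemma-3-4b-it-Q4_K_M.gguf (the actual filename depends on the download) and that llama.cpp's llama-cli binary has been built:

```shell
# Run a single prompt against the local Q4_K_M model with llama.cpp.
# -m: path to the GGUF file (filename here is an assumption)
# -p: the prompt text
# -n: cap generation at 256 tokens
./llama-cli -m gemma-3-4b-it-Q4_K_M.gguf \
  -p "Write a short poem about autumn." \
  -n 256
```

Because the model is fully loaded from the local GGUF file, this runs entirely on-device with no cloud service involved.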

Model Capabilities

Text Generation
Dialogue System
Content Creation

Use Cases

Content Generation
Creative Writing
Generate creative content such as stories and poems.
Q&A System
Answer a wide range of user questions.
Development Assistance
Code Generation
Assist in generating code snippets.
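For dialogue and code-assistance use cases like those above, the model can also be driven programmatically. A minimal sketch using the llama-cpp-python bindings, assuming the package is installed and the GGUF file path (a placeholder here) points at the downloaded model:

```python
from llama_cpp import Llama

# Load the local Q4_K_M GGUF model (path is an assumed placeholder).
llm = Llama(model_path="gemma-3-4b-it-Q4_K_M.gguf")

# Ask a chat-style question; the bindings apply the model's chat template.
response = llm.create_chat_completion(
    messages=[
        {"role": "user", "content": "Write a Python function that reverses a string."}
    ],
    max_tokens=256,  # limit the length of the generated answer
)

print(response["choices"][0]["message"]["content"])
```

The same pattern covers Q&A and creative writing by changing the user message.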