Granite 4.0 Tiny Base Preview

Developed by ibm-granite
Granite-4.0-Tiny-Base-Preview is a 7-billion-parameter Mixture-of-Experts (MoE) language model developed by IBM, featuring a 128k-token context window and a hybrid architecture that incorporates Mamba-2 layers.
Downloads: 156
Release date: April 30, 2025

Model Overview

This multilingual large language model is suited to text generation, information extraction, and similar tasks, and serves as a base model that can be fine-tuned for specific scenarios.
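As a base model, it can be used for plain text completion. The sketch below shows one way to load it with the Hugging Face transformers library; the repository id and generation settings are assumptions, not values stated in this card.

```python
# Minimal text-completion sketch. MODEL_ID is an assumed Hugging Face
# repository id; verify it against the ibm-granite organization before use.
MODEL_ID = "ibm-granite/granite-4.0-tiny-base-preview"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    # Imports are kept inside the function so the sketch can be read
    # and tested without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("The three primary colors are"))
```

Because this is a base (non-instruct) model, it continues text rather than following instructions, so prompts work best as completions or few-shot examples.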

Model Features

Extended context processing
Supports a 128k-token context window, well suited to processing long documents and understanding complex contexts
Mixture of Experts architecture
Utilizes an MoE architecture to improve efficiency and reduce computational cost without compromising performance
Multilingual support
Natively supports 12 languages and can be fine-tuned for additional languages
No positional encoding design
Employs NoPE (No Positional Encoding) for better length generalization
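Even with a 128k-token window, inputs must be budgeted so the document, the prompt, and the generated output all fit. The sketch below splits a long document into window-sized chunks; it approximates token counts with whitespace-separated words, whereas real code would use the model's own tokenizer.

```python
# Sketch: split a long document into chunks that fit a 128k-token context
# window, reserving room for instructions and generated output.
# Word counts stand in for token counts here (an approximation).

CONTEXT_WINDOW = 131072   # 128k tokens
RESERVED = 4096           # headroom for the prompt and generation

def chunk_document(text: str, budget: int = CONTEXT_WINDOW - RESERVED):
    """Return a list of chunks, each at most `budget` words long."""
    words = text.split()
    return [" ".join(words[i:i + budget]) for i in range(0, len(words), budget)]
```

Documents shorter than the budget come back as a single chunk, so the same path handles short and long inputs.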

Model Capabilities

Text generation
Text summarization
Information extraction
Question-answering systems
Multilingual processing
Long document comprehension

Use Cases

Content generation
Automatic summarization
Generates concise and accurate summaries of long documents
Multilingual content creation
Generates marketing copy, product descriptions, and other content in multiple languages
Information processing
Document question-answering system
Extracts precise answers from long documents
Knowledge extraction
Extracts structured information from unstructured text
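For a base model, structured extraction is typically steered with a few-shot prompt rather than an instruction. The sketch below builds such a prompt; the JSON schema and the worked example inside it are illustrative assumptions, not part of the model card.

```python
import json

def build_extraction_prompt(text: str) -> str:
    """Build a few-shot prompt that asks a base model to continue with JSON.

    The example record and field names are illustrative; adapt the schema
    to the information you actually want to extract.
    """
    example = {"name": "Ada Lovelace", "year": 1843}
    return (
        "Extract structured records as JSON.\n\n"
        "Text: Ada Lovelace published her notes in 1843.\n"
        f"JSON: {json.dumps(example)}\n\n"
        f"Text: {text}\n"
        "JSON:"
    )
```

Ending the prompt with `JSON:` nudges the model to complete with a JSON object, which can then be parsed and validated by the caller.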