
Nemotron-H-47B-Base-8K

Developed by NVIDIA
The NVIDIA Nemotron-H-47B-Base-8K is a large language model (LLM) developed by NVIDIA, designed for text completion tasks. It features a hybrid architecture primarily composed of Mamba-2 and MLP layers, with only five attention layers.
Downloads: 1,242
Release Time: 4/8/2025

Model Overview

Nemotron-H-47B-Base-8K is a large language model that supports an 8K context length and is intended for text generation and completion tasks. It supports multiple languages, including English, German, Spanish, French, Italian, Korean, Portuguese, Russian, Japanese, and Chinese.
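As a base model, it can be driven like any causal language model for text completion. Below is a minimal sketch using Hugging Face Transformers; the repository id nvidia/Nemotron-H-47B-Base-8K, the trust_remote_code flag, and the multi-GPU loading settings are assumptions for illustration rather than details confirmed by this listing.

```python
# Minimal text-completion sketch (assumed repo id and loading options).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nvidia/Nemotron-H-47B-Base-8K"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 47B parameters: multi-GPU or offloading is typically required
    device_map="auto",           # requires the accelerate package
    trust_remote_code=True,
)

# Plain completion: the base model continues the prompt rather than following chat instructions.
prompt = "The hybrid Mamba-Transformer architecture is efficient because"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```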

Model Features

Hybrid Architecture
Combines Mamba-2 and MLP layers with only five attention layers, improving inference efficiency while preserving model quality.
Multilingual Support
Supports 10 languages, including English, German, Spanish, French, Italian, Korean, Portuguese, Russian, Japanese, and Chinese.
8K Context Length
Supports a context length of up to 8K tokens, making it suitable for long-text processing tasks.
Efficient Training
Pruned and distilled from Nemotron-H-56B-Base-8K using 63 billion training tokens, substantially reducing training cost compared with training from scratch.

Model Capabilities

Text Generation
Multilingual Text Completion
Long-Context Processing

Use Cases

Research and Development
Large Language Model Research
Used in research projects for building and optimizing large language models.
Supports multiple languages and long-context processing.
Text Generation Tasks
Used for generating and completing text content, such as articles and dialogues.
Produces high-quality multilingual text.