
RWKV7 Goose World3 2.9B HF GGUF

Developed by: Mungert
An RWKV-7 model in the flash-linear attention (fla) format, supporting multilingual text generation.
Downloads: 14.51k
Released: 3/21/2025

Model Overview

This is a 2.9-billion-parameter RWKV-7 model built on the flash-linear attention architecture. It supports eight languages: English, Chinese, Japanese, Korean, French, Arabic, Spanish, and Portuguese.
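RWKV-7's efficiency comes from replacing quadratic self-attention with a recurrent state that is updated once per token. The sketch below is a deliberately simplified, unnormalized linear-attention recurrence for illustration only; the function names are ours, and the real RWKV-7 kernel adds decay, gating, and other terms not shown here.

```python
import numpy as np

def linear_attention_step(state, k, v, q):
    """One recurrent step of simplified (unnormalized) linear attention.

    state: (d_k, d_v) running outer-product memory
    k, q:  (d_k,) key and query vectors for this token
    v:     (d_v,) value vector for this token
    Returns (new_state, output). Memory use is constant in sequence length.
    """
    state = state + np.outer(k, v)   # accumulate the key-value association
    out = q @ state                  # read the memory with the query
    return state, out

def run_sequence(keys, values, queries):
    """Process a whole sequence token by token with a single running state."""
    d_k, d_v = keys.shape[1], values.shape[1]
    state = np.zeros((d_k, d_v))
    outputs = []
    for k, v, q in zip(keys, values, queries):
        state, out = linear_attention_step(state, k, v, q)
        outputs.append(out)
    return np.stack(outputs)
```

Because the state folds all past tokens into one fixed-size matrix, each new token costs O(d_k * d_v) regardless of context length, which is the efficiency property the feature list below refers to.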

Model Features

Multilingual support
Generates text in 8 languages, including major languages such as English and Chinese.
Efficient architecture
Uses the flash-linear attention architecture to improve computational efficiency.
Multiple quantization options
Provides model formats ranging from BF16 down to ultra-low-bit quantization, so the model can fit a range of hardware budgets.
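To pick a quantization level for your hardware, a back-of-the-envelope weight-size estimate is enough. The sketch below uses the model's 2.9B parameter count; the bits-per-weight figures for the GGUF quant types are approximate community values, not exact numbers for this repository's files, and actual file sizes also include small amounts of metadata.

```python
def quant_size_gib(n_params, bits_per_weight):
    """Approximate weight file size in GiB for a given quantization width."""
    return n_params * bits_per_weight / 8 / 1024**3

N_PARAMS = 2.9e9  # parameter count of this model

# (quant name, approximate effective bits per weight)
for name, bits in [("BF16", 16), ("Q8_0", 8.5), ("Q4_K_M", 4.85), ("Q2_K", 2.6)]:
    print(f"{name}: ~{quant_size_gib(N_PARAMS, bits):.1f} GiB")
```

For example, BF16 weights alone come to roughly 5.4 GiB, while a 4-bit-class quant drops below 2 GiB, which is the difference between needing a mid-range GPU and fitting comfortably on a laptop.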

Model Capabilities

Multilingual text generation
Dialogue systems
Content creation

Use Cases

Dialogue systems
Multilingual chatbot
Build intelligent dialogue systems that support multiple languages.
Content creation
Multilingual content generation
Automatically generate marketing copy or articles in multiple languages
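For the chatbot use case, the main integration work is formatting conversation turns into the model's prompt template. The helper below assumes the plain `User:` / `Assistant:` template commonly used with RWKV "World" models; verify the exact template against this repository's model card before relying on it.

```python
def build_prompt(turns, user_label="User", bot_label="Assistant"):
    """Format alternating chat turns into a 'User: ... / Assistant: ...' prompt.

    turns: list of (user_message, assistant_reply) pairs; pass None as the
    reply for the final turn so the prompt ends with an open assistant turn
    for the model to complete.
    """
    parts = []
    for user_msg, bot_msg in turns:
        parts.append(f"{user_label}: {user_msg}\n\n{bot_label}:")
        if bot_msg is not None:
            parts[-1] += f" {bot_msg}\n\n"
    return "".join(parts)

# Works the same for any of the supported languages:
prompt = build_prompt([("Bonjour, pouvez-vous m'aider ?", None)])
```

The resulting string can be passed as-is to whatever runtime serves the model (for the GGUF files, typically a llama.cpp-based loader).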