
RWKV7 Goose World3 2.9B HF

Developed by: RWKV
Released: 3/17/2025

RWKV-7 uses the flash linear attention format, supports multilingual text generation, and has 2.9 billion parameters.

Model Overview

This is a large language model based on the RWKV-7 architecture. It uses the flash linear attention format and supports text generation in multiple languages, including Chinese.

Model Features

Flash Linear Attention
Uses the flash linear attention format, which replaces quadratic softmax attention with a fixed-size recurrent state update, reducing compute and memory cost for long sequences.
Multilingual Support
Supports text generation in 8 languages, including Chinese.
Large-Scale Training
Trained on the World v3 dataset, with a total token count of 3.119 trillion.
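The efficiency claim behind linear attention can be illustrated with a minimal sketch. This is plain NumPy, not the actual RWKV-7 kernels: it shows the core idea that, once the softmax is replaced by a simple (already feature-mapped, non-negative) query/key product, causal attention over T tokens can be computed with a running d_k × d_v state in O(T) time instead of the O(T²) attention matrix. Function names here are illustrative, not part of any library API.

```python
import numpy as np

def linear_attention_recurrent(q, k, v):
    """Causal linear attention via a fixed-size recurrent state.

    q, k: (T, d_k) non-negative feature-mapped queries/keys
    v:    (T, d_v) values
    Carries S = sum_s outer(k_s, v_s) and z = sum_s k_s instead of a
    growing attention matrix, so cost per token is O(d_k * d_v).
    """
    T, d_k = q.shape
    d_v = v.shape[1]
    S = np.zeros((d_k, d_v))   # running sum of outer(k_t, v_t)
    z = np.zeros(d_k)          # running sum of k_t (normalizer)
    out = np.zeros((T, d_v))
    for t in range(T):
        S += np.outer(k[t], v[t])
        z += k[t]
        out[t] = (q[t] @ S) / (q[t] @ z + 1e-9)
    return out

def linear_attention_parallel(q, k, v):
    """The same computation in quadratic attention-matrix form, for checking."""
    A = q @ k.T                                      # unnormalized scores
    A = np.tril(A)                                   # causal mask
    A = A / (A.sum(axis=1, keepdims=True) + 1e-9)    # row-normalize
    return A @ v

# Tiny demo: both forms agree on random non-negative inputs.
rng = np.random.default_rng(0)
q, k, v = rng.random((6, 4)), rng.random((6, 4)), rng.random((6, 3))
assert np.allclose(linear_attention_recurrent(q, k, v),
                   linear_attention_parallel(q, k, v), atol=1e-6)
```

RWKV-7's actual state update is more sophisticated (it adds per-channel decay and gating on top of this recurrence), but the key property is the same: a fixed-size state replaces the growing KV cache, which is what makes long-context generation cheap.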

Model Capabilities

Multilingual Text Generation
Dialogue System Construction
Content Creation

Use Cases

Dialogue System
Intelligent Assistant
Build multilingual intelligent dialogue assistants.
Content Generation
Multilingual Content Creation
Generate text content in multiple languages.