
RWKV7 Goose World3 1.5B HF

Developed by RWKV
The RWKV-7 model in flash-linear attention format, supporting English text generation tasks.
Downloads 70
Release Time: 3/17/2025

Model Overview

This is a 1.47-billion-parameter language model based on the RWKV-7 architecture, provided in the flash-linear-attention format and primarily designed for English text generation.

Model Features

Flash Linear Attention
Provided in the flash-linear-attention format, which improves training and inference efficiency.
Dynamic State Evolution
Features expressive dynamic state evolution: the recurrent state is updated in a data-dependent way at each step, increasing modeling power over fixed-decay linear attention.
Efficient Training
Trained on the Pile dataset with a total of 332 billion tokens.
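The dynamic-state idea above can be illustrated with a toy recurrence. This is a simplified sketch of a linear-attention state update with a per-channel decay vector, in the spirit of RWKV-7's state evolution; it is not the exact RWKV-7 update rule, and all names and dimensions here are assumptions for the example.

```python
# Toy linear-attention recurrence with per-channel decay (illustrative
# only, not the exact RWKV-7 formula). The state is a d x d key-value
# memory; each step decays it and adds the new (key, value) outer product.

def step(state, w, k, v, q):
    """One recurrent step.

    state: d x d matrix (list of lists), the running key-value memory.
    w:     length-d decay vector in (0, 1); data-dependent in RWKV-7.
    k, v, q: length-d key, value, and query vectors for this token.
    Returns (new_state, output).
    """
    d = len(state)
    new_state = [
        [w[i] * state[i][j] + v[i] * k[j] for j in range(d)]
        for i in range(d)
    ]
    out = [sum(new_state[i][j] * q[j] for j in range(d)) for i in range(d)]
    return new_state, out

# Tiny worked example with d = 2.
d = 2
state = [[0.0] * d for _ in range(d)]
w = [0.9, 0.5]                        # per-channel decay
tokens = [([1.0, 0.0], [2.0, 0.0]),   # (k, v) pairs, one per token
          ([0.0, 1.0], [0.0, 3.0])]
for k, v in tokens:
    state, out = step(state, w, k, v, q=[1.0, 1.0])
print(out)  # readout after the second token
```

Because the state is a fixed-size matrix, the cost per token is constant, which is the efficiency advantage of this family of models over softmax attention.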

Model Capabilities

Text Generation

Use Cases

Text Generation
General Text Generation
Can be used to generate various types of English text content.
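Text generation with a causal language model of this kind is autoregressive: score the next token given the context, append it, and repeat. The sketch below shows that loop using a hypothetical hard-coded bigram table in place of the real model's logits; the table, token names, and `generate` helper are all assumptions for illustration, not the model's actual API.

```python
# Toy greedy decoding loop. A hard-coded bigram score table stands in
# for a real language model's next-token logits.

BIGRAMS = {
    "the": {"quick": 0.9, "end": 0.1},
    "quick": {"brown": 1.0},
    "brown": {"fox": 1.0},
    "fox": {"<eos>": 1.0},
}

def generate(prompt, max_new_tokens=10):
    tokens = prompt.split()
    for _ in range(max_new_tokens):
        # Score candidates given the last token; unseen contexts end generation.
        scores = BIGRAMS.get(tokens[-1], {"<eos>": 1.0})
        nxt = max(scores, key=scores.get)  # greedy decoding: take the argmax
        if nxt == "<eos>":
            break
        tokens.append(nxt)
    return " ".join(tokens)

print(generate("the"))  # → "the quick brown fox"
```

With the real model, the score table is replaced by a forward pass over the current context, and sampling strategies (temperature, top-p) can replace the greedy argmax.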