
RWKV7 0.1B G1

Developed by fla-hub
An RWKV-7 G1 model built on the flash linear attention mechanism, with multilingual support and deep-thinking capability
Downloads: 377
Release Time: 3/10/2025

Model Overview

This is a multilingual large language model with 191 million parameters built on the RWKV7 architecture. It supports multiple languages, including English and Chinese, features deep-thinking ability, and is suited to tasks such as text generation.
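As a minimal sketch of how such a model is typically loaded, the snippet below uses the Hugging Face transformers API. The model identifier "fla-hub/rwkv7-0.1B-g1" and the need for the flash-linear-attention package are assumptions based on this card, not confirmed details.

```python
# Minimal loading sketch (assumptions: the checkpoint is published on the
# Hugging Face Hub as "fla-hub/rwkv7-0.1B-g1" and the flash-linear-attention
# package is installed, e.g. `pip install flash-linear-attention`).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "fla-hub/rwkv7-0.1B-g1"  # assumed Hub identifier
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "The RWKV-7 architecture is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```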

Model Features

Multilingual support
Handles multiple languages, including English, Chinese, Japanese, Korean, French, Arabic, Spanish, and Portuguese
Deep thinking ability
The G1 model series incorporates deep-thinking ability and can generate higher-quality text
Efficient attention mechanism
Built on the flash linear attention mechanism to improve model efficiency

Model Capabilities

Multilingual text generation
Dialogue system
Content creation

Use Cases

Dialogue system
Intelligent assistant
Used to build multilingual intelligent dialogue assistants (see the sketch after this list)
Generates coherent, logically consistent dialogue responses
Content creation
Multilingual content generation
Generates content such as news articles and stories in multiple languages
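For the dialogue use case, the sketch below shows one way to prompt the model for a chat-style response. The "User:/Assistant:" prompt convention and the generation settings are assumptions for illustration; the card does not specify the exact chat format.

```python
# Minimal dialogue sketch (assumptions: "fla-hub/rwkv7-0.1B-g1" is the Hub
# identifier and the model follows a plain "User:/Assistant:" prompt style).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "fla-hub/rwkv7-0.1B-g1"  # assumed Hub identifier
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Build a simple single-turn chat prompt.
prompt = "User: Please introduce yourself in one sentence.\n\nAssistant:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, top_p=0.9)

# Decode only the newly generated tokens (the assistant's reply).
reply = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(reply)
```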