
Minueza-32M-Base

Developed by Felladrin
Minueza-32M-Base is a 32-million-parameter base model trained entirely on English text corpora and suited to text generation tasks.
Downloads: 68
Released: 2/25/2024

Model Overview

This is a base pre-trained model intended primarily for text generation; it can be adapted to specific applications through fine-tuning.
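As a sketch of basic usage, the model can be loaded for text generation with the Hugging Face transformers library. This assumes the model id "Felladrin/Minueza-32M-Base" on the Hugging Face Hub; check the model card for the exact id and recommended generation settings.

```python
# Hedged sketch: generate text with the base model via the Hugging Face
# transformers text-generation pipeline. Assumes the model id
# "Felladrin/Minueza-32M-Base"; weights are downloaded on first run.
from transformers import pipeline

generator = pipeline("text-generation", model="Felladrin/Minueza-32M-Base")

# A base model continues raw text rather than following instructions,
# so we prompt it with the start of a passage to complete.
out = generator("Once upon a time,", max_new_tokens=40, do_sample=True)
print(out[0]["generated_text"])
```

Because this is a base model, output quality depends heavily on the prompt; instruction following and dialogue require fine-tuning, as noted under Model Capabilities.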

Model Features

Lightweight design
A compact 32M-parameter model that runs quickly on devices without a GPU
Mobile compatibility
Can run in mobile browsers via Transformers.js
ChatML format support
Includes special tokens supporting the ChatML format, making it easy to fine-tune dialogue models
Multi-dataset training
Trained on nine high-quality datasets with interleaved sampling, totaling approximately 650 million tokens
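The ChatML support mentioned above can be illustrated with a small sketch. ChatML wraps each conversation turn in <|im_start|>role ... <|im_end|> special tokens; the helper function below is hypothetical (not part of the model's tooling) and only shows the prompt shape a ChatML fine-tune would consume.

```python
# Minimal sketch of the ChatML prompt format. The helper name is
# illustrative; only the <|im_start|>/<|im_end|> token layout comes
# from the ChatML convention the model's tokenizer supports.

def build_chatml_prompt(messages):
    """Render a list of {'role', 'content'} dicts as a ChatML string."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Leave an open assistant turn for the model to complete.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Describe a fantasy kingdom."},
])
print(prompt)
```

A dialogue fine-tune of this base model would be trained on text in exactly this layout, then prompted the same way at inference time.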

Model Capabilities

Text generation
Basic language understanding
Instruction following (requires fine-tuning)
Dialogue generation (requires fine-tuning)

Use Cases

Game content generation
Fantasy world setting generation
Generate background content such as world lore and character settings for fantasy games
Can generate coherent fantasy world descriptions
Educational content generation
Textbook content generation
Generate educational content based on open textbook data
Environmental content generation
Waste management suggestions
Generate practical suggestions for reducing waste production