
Minueza-32M-Chat

Developed by Felladrin
Minueza-32M-Chat is a chat model with 32 million parameters, based on Felladrin/Minueza-32M-Base and trained with supervised fine-tuning (SFT) and direct preference optimization (DPO).
Downloads: 77
Release date: 2/25/2024

Model Overview

This is a small yet efficient chat model suitable for various dialogue scenarios, capable of providing helpful responses and suggestions.
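A minimal sketch of how a prompt for the model might be assembled, assuming it follows the ChatML conversation format commonly used by small SFT chat models (check the model card on Hugging Face to confirm the exact template; the `build_chatml_prompt` helper below is illustrative, not part of the model's release):

```python
def build_chatml_prompt(messages):
    """Format a list of {'role', 'content'} dicts as a ChatML prompt string."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Leave an open assistant turn so the model generates the reply.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Suggest a first step for a career switch into data analysis."},
])
```

The resulting string would then be passed to a text-generation pipeline loaded with the `Felladrin/Minueza-32M-Chat` checkpoint.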

Model Features

Compact and Efficient
With only 32 million parameters, the model delivers usable dialogue capabilities thanks to careful training.
Multi-dataset Training
Fine-tuned on several high-quality datasets, including Dolly, WebGLM, and Capybara.
Direct Preference Optimization
Uses DPO to align responses with human preferences and improve response quality.

Model Capabilities

Text Generation
Dialogue Interaction
Q&A System
Creative Writing
Career Counseling
Health Advice

Use Cases

Dialogue Systems
Career Counseling
Provides career development advice and guidance to users
Offers personalized career suggestions based on user skills and interests
Knowledge Q&A
Health Advice
Answers questions about healthy lifestyles
Provides common-sense health improvement suggestions
Creative Generation
Game Setting Creation
Generates fantasy game settings based on user requests
Creates imaginative game worlds and characters