
Minueza-2-96M

Developed by Felladrin
A compact language model based on the Llama architecture, with 96 million parameters and a 4096-token context length, supporting English and Portuguese.
Downloads: 357
Release Date: 4/5/2025

Model Overview

A lightweight foundational model trained from scratch, serving as a base for subsequent fine-tuning for specific applications. While its reasoning and knowledge are limited, it is suitable for use in resource-constrained environments.
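A minimal sketch of running the model for plain text completion with the Hugging Face transformers library. The model id "Felladrin/Minueza-2-96M" and the character-per-token heuristic are assumptions, not guarantees from this page; the download step is guarded behind an environment variable so the helper can be used without fetching weights.

```python
import os

def build_prompt(text: str, max_chars: int = 4096 * 4) -> str:
    """Trim input so it stays within the 4096-token context window,
    using a rough heuristic of ~4 characters per token (an assumption)."""
    return text[-max_chars:]

if os.environ.get("RUN_MINUEZA_DEMO"):  # guarded: downloads model weights
    from transformers import pipeline  # pip install transformers

    generator = pipeline(
        "text-generation",
        model="Felladrin/Minueza-2-96M",  # assumed Hugging Face model id
        device=-1,  # CPU is sufficient for a 96M-parameter model
    )
    result = generator(
        build_prompt("The capital of Portugal is"),
        max_new_tokens=20,
    )
    print(result[0]["generated_text"])
```

Because the model has only 96M parameters, inference on a single CPU core is practical, which is what makes it usable in the resource-constrained settings described below.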

Model Features

Compact and Efficient
A small model with only 96 million parameters, suitable for running on devices without GPUs or on mobile platforms.
Bilingual Support
Supports text generation in both English and Portuguese.
Long Context Processing
Supports a context window length of 4096 tokens.
Fine-tuning Friendly
Designed to serve as a base model for fine-tuning in ChatML format.
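Since the page says the model is meant to be fine-tuned in ChatML format, a small helper that renders conversations into that template may clarify what the training data would look like. The `<|im_start|>`/`<|im_end|>` markers are the standard ChatML delimiters; the exact special tokens used in any given fine-tune are an assumption here.

```python
def to_chatml(messages: list[dict]) -> str:
    """Render a list of {"role": ..., "content": ...} dicts as a ChatML
    prompt, ending with an open assistant turn to cue generation."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    parts.append("<|im_start|>assistant\n")  # model completes from here
    return "\n".join(parts)
```

For example, `to_chatml([{"role": "user", "content": "Olá"}])` yields a prompt beginning with the user turn and ending with an open `<|im_start|>assistant` block for the model to complete.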

Model Capabilities

Text Generation
Multilingual Processing

Use Cases

Mobile Applications
In-browser Text Generation
Runs in mobile browsers via Wllama and Transformers.js, enabling lightweight client-side text generation.
Resource-Constrained Environments
Deployment on Low-Power Devices
Runs efficiently on devices without GPUs, providing basic language model capabilities on edge hardware.