TeenyTinyLlama-160m GGUF

Developed by afrideva
TeenyTinyLlama-160m is a compact, 160-million-parameter language model optimized for Brazilian Portuguese, developed for research on low-resource languages.
Downloads 148
Release Time: 5/12/2024

Model Overview

This model is a Transformer-based causal language model designed primarily for Brazilian Portuguese text generation. It serves as an experimental model for studying the challenges of developing language models for low-resource languages.

Model Features

Low-resource language optimization
Specifically trained for Brazilian Portuguese, addressing the gap in low-resource language models
Efficient training
Trained in approximately 36 hours on a single NVIDIA A100 GPU, with estimated carbon emissions of only 5.6 kg CO2
Open-source transparency
Full disclosure of training code and process to facilitate research and reproducibility
Lightweight
A compact model with only 160 million parameters, ideal for research and experimental purposes

Model Capabilities

Portuguese text generation
Text completion
Language understanding
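The GGUF files in this repository can be run locally with llama.cpp-compatible tooling. Below is a minimal sketch using the llama-cpp-python bindings; the quantized file name and the generation settings are assumptions, so check the repository for the exact file you downloaded. Since TeenyTinyLlama is a plain causal LM with no chat template, the prompt is simply Portuguese text to be continued.

```python
# Minimal sketch of running a TeenyTinyLlama-160m GGUF quantization
# locally with llama-cpp-python. MODEL_PATH is an assumed file name --
# substitute the quantized file you actually downloaded.

MODEL_PATH = "teenytinyllama-160m.q4_k_m.gguf"  # assumed file name


def build_prompt(text: str) -> str:
    """TeenyTinyLlama is a base causal LM, not an instruction model,
    so the prompt is just the Portuguese text to be continued."""
    return text.strip()


def generate(prompt: str, model_path: str = MODEL_PATH) -> str:
    # Imported here so the sketch can be read without
    # llama-cpp-python installed.
    from llama_cpp import Llama

    # A 160M-parameter model loads quickly even on CPU.
    llm = Llama(model_path=model_path, n_ctx=512, verbose=False)
    out = llm(build_prompt(prompt), max_tokens=64, temperature=0.8)
    return out["choices"][0]["text"]


if __name__ == "__main__":
    print(generate("A capital do Brasil é"))
```

Because this is a base model intended for research, expect raw text completion rather than instruction-following behavior, consistent with the benchmark scores listed below.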

Use Cases

Education
ENEM exam question generation
Generating questions in the style of Brazil's National High School Exam (ENEM)
Accuracy 19.24%
OAB legal exam assistance
Generating questions related to the Brazilian Bar Exam
Accuracy 22.37%
Content moderation
Hate speech detection
Identifying hate speech in Portuguese
F1 macro average 36.92-42.63
Language understanding
Textual entailment recognition
Determining logical relationships between Portuguese sentences
F1 macro average 53.97
Semantic similarity calculation
Computing semantic similarity between Portuguese sentences
Pearson coefficient 0.24