Tucano 160m

Developed by TucanoBR
Tucano-160m is a decoder-only Transformer model natively pre-trained in Portuguese, developed for research on Portuguese language modeling.
Downloads: 1,183
Release date: 9/20/2024

Model Overview

Tucano-160m belongs to the Tucano series of decoder-only Transformer models natively pre-trained in Portuguese, intended mainly for research and development of Portuguese language modeling.

Model Features

Native Portuguese pre-training
Tucano-160m is pre-trained natively on Portuguese text rather than adapted from another language, strengthening its Portuguese understanding and generation abilities.
Large-scale dataset training
The model is trained on GigaVerbo, a dataset of 200 billion deduplicated Portuguese tokens.
High-performance computing support
Training used 8 NVIDIA A100-SXM4-80GB GPUs and took about 44 hours.

Model Capabilities

Portuguese text generation
Language modeling research
Text completion
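The capabilities above can be exercised with the Hugging Face `transformers` library. A minimal sketch follows; the model ID `TucanoBR/Tucano-160m` and the generation settings are assumptions based on this card, not verified details.

```python
# Sketch: Portuguese text completion with Tucano-160m via transformers.
# The model ID below is an assumption inferred from the card's developer
# ("TucanoBR") and model name; adjust it if the hub listing differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "TucanoBR/Tucano-160m"  # assumed Hugging Face model ID

def complete(prompt: str, max_new_tokens: int = 50) -> str:
    """Greedy text completion in Portuguese using the base model."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    # Base models continue the prompt; they are not instruction-tuned.
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

For example, `complete("A capital do Brasil é")` would return the prompt followed by a model-generated continuation. Since this is a base model, it performs completion rather than chat-style instruction following.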

Use Cases

Research and development
Portuguese language modeling research
Used to study the performance of Portuguese language models and methods for improving them.
Text generation experiments
Serves as a base model for text generation experiments, evaluating how different parameters affect output quality.