
Albertina 1.5B Portuguese PT-BR Encoder

Developed by PORTULAN
Albertina 1.5B PTBR is a foundation large language model for the Brazilian variant of Portuguese. It is an encoder of the BERT family, based on the Transformer neural network architecture and developed on top of the DeBERTa model.
Downloads: 83
Release date: 10/27/2023

Model Overview

This is a large language model designed specifically for the Brazilian variant of Portuguese. With 1.5 billion parameters, it delivers the most competitive performance reported for this language.

Model Features

- Optimized for Brazilian Portuguese: trained and tuned specifically for the Brazilian variant of Portuguese
- Large-scale parameters: with 1.5 billion parameters, it sets a new benchmark for Brazilian Portuguese
- High performance: achieves the most competitive results on Brazilian Portuguese tasks
- Open license: freely distributed under the permissive MIT license

Model Capabilities

- Text understanding
- Masked language modeling
- Brazilian Portuguese text processing

Use Cases

Natural language processing
- Text completion: automatically fill in masked segments of text. In the published example, 'tradition' was correctly predicted as the most likely completion.
- Language understanding: capture the semantics and context of Brazilian Portuguese text.
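The text-completion use case above can be sketched with the Hugging Face `transformers` fill-mask pipeline. This is a minimal sketch, not an official usage snippet from the model card: the Hub model id below follows PORTULAN's naming convention and is an assumption here, and the heavy download is left commented out.

```python
# Sketch: masked-token completion with an encoder model via `transformers`.
# Requires `pip install transformers`; the pipeline import is commented so the
# helper below stays runnable without the library installed.
# from transformers import pipeline

# Assumed Hugging Face Hub id for this model (not confirmed by the source text).
MODEL_ID = "PORTULAN/albertina-1b5-portuguese-ptbr-encoder"

def top_tokens(fill_mask, text, k=3):
    """Return the k most likely fill-ins for the "[MASK]" slot.

    `fill_mask` is any callable with the fill-mask pipeline's output shape:
    a list of dicts, each with a "token_str" key, ranked by score.
    """
    return [pred["token_str"].strip() for pred in fill_mask(text, top_k=k)]

# Example usage (downloads the 1.5B-parameter weights on first run):
# fill_mask = pipeline("fill-mask", model=MODEL_ID)
# print(top_tokens(fill_mask, "A culinária brasileira é rica em sabores e [MASK]."))
```

Because the pipeline is passed in as an argument, the helper can be exercised against any compatible callable before committing to the full model download.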