Albertina 100M Portuguese PT-PT Encoder

Developed by PORTULAN
Albertina 100M PTPT is a foundation large language model for European Portuguese (Portugal), belonging to the BERT family of encoders. It is based on the Transformer neural network architecture and built on the DeBERTa model.
Downloads: 171
Release Date: 2023-05-25

Model Overview

This model is optimized for European Portuguese, featuring 100 million parameters, and is suitable for tasks such as masked language modeling.
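As a quick sketch of the masked language modeling task named above: the snippet below loads the encoder with the Hugging Face transformers library and predicts a masked token. The model ID PORTULAN/albertina-100m-portuguese-ptpt-encoder is an assumption here; verify the exact identifier on the PORTULAN hub page.

```python
# Minimal masked-LM sketch, assuming the checkpoint is hosted on the
# Hugging Face Hub as "PORTULAN/albertina-100m-portuguese-ptpt-encoder".
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "PORTULAN/albertina-100m-portuguese-ptpt-encoder"  # assumed ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Mask one token in a European Portuguese sentence.
text = f"A capital de Portugal é {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and take the highest-scoring token.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))  # expected completion: "Lisboa" (not guaranteed)
```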

Model Features

Optimized for European Portuguese
Specifically trained and optimized for European Portuguese (PT-PT)
Based on DeBERTa Architecture
Built on the DeBERTa model, whose disentangled attention mechanism improves on standard BERT-style self-attention
Permissive License
Uses the MIT license, allowing free use and distribution

Model Capabilities

Masked Language Modeling
Text Understanding
Contextual Prediction
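
To illustrate the text-understanding capability listed above, the sketch below extracts contextual token embeddings and compares two sentences. It reuses the assumed model ID from the previous example; mean pooling over tokens is one common way to obtain sentence vectors, not a method prescribed by the model's authors.

```python
# Contextual-embedding sketch, using the same assumed model ID as above.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "PORTULAN/albertina-100m-portuguese-ptpt-encoder"  # assumed ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

sentences = ["O tempo está ótimo em Lisboa.", "Está um dia lindo em Lisboa."]
inputs = tokenizer(sentences, padding=True, return_tensors="pt")

with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (batch, seq_len, hidden_size)

# Mean-pool over real (non-padding) tokens to get one vector per sentence.
mask = inputs.attention_mask.unsqueeze(-1)
embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)

# Semantically similar sentences should yield a high cosine similarity.
similarity = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(f"cosine similarity: {similarity:.3f}")
```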

Use Cases

Natural Language Processing
Text Completion
Predicts masked words in sentences
For example, it can predict plausible completions in sentences about Portuguese cuisine; see the sketch after this list
Language Understanding
Understands the semantics of European Portuguese text
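
A minimal sketch of the text-completion use case via the transformers fill-mask pipeline; the cuisine sentence is purely illustrative and not taken from the model's own documentation.

```python
# Fill-mask pipeline sketch; model ID and example sentence are assumptions.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="PORTULAN/albertina-100m-portuguese-ptpt-encoder",  # assumed ID
)

# Print the top candidate words with their scores.
for pred in fill_mask("A gastronomia portuguesa é famosa pelo seu [MASK]."):
    print(f"{pred['token_str']!r}  score={pred['score']:.3f}")
```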