BERT Large Portuguese Cased
BERTimbau Large is a pre-trained BERT model for Brazilian Portuguese, achieving state-of-the-art performance on multiple downstream NLP tasks.
Model Overview
BERTimbau Large is a pre-trained language model based on the BERT Large architecture, trained specifically for Brazilian Portuguese and suitable for a wide range of natural language processing tasks.
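As a minimal sketch of how the model can be loaded, the snippet below uses the Hugging Face Transformers library and assumes the checkpoint identifier neuralmind/bert-large-portuguese-cased; adjust the name if your copy is published under a different path.

```python
# Minimal loading sketch (assumes the Hugging Face Transformers library and
# the checkpoint "neuralmind/bert-large-portuguese-cased").
from transformers import AutoTokenizer, AutoModel

model_name = "neuralmind/bert-large-portuguese-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a Brazilian Portuguese sentence and inspect the contextual embeddings.
inputs = tokenizer("Tinha uma pedra no meio do caminho.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, num_tokens, 1024) for the Large model
```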
Model Features
Optimized for Brazilian Portuguese
Specifically pre-trained and optimized for Brazilian Portuguese
Large-scale Pretraining
Pre-trained on the large brWaC (Brazilian Web as Corpus) corpus
Excellent Multi-task Performance
Achieves state-of-the-art performance on multiple downstream NLP tasks
Model Capabilities
Text Embedding
Masked Language Modeling (see the fill-mask sketch after this list)
Named Entity Recognition
Sentence Similarity Calculation
Textual Entailment Recognition
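The masked language modeling capability can be exercised directly through the fill-mask pipeline. The sketch below assumes the same checkpoint name as above; BERT-style models expect the [MASK] token.

```python
# Fill-mask sketch (assumes the checkpoint "neuralmind/bert-large-portuguese-cased").
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="neuralmind/bert-large-portuguese-cased",
)

# Predict the most likely tokens for the masked position.
predictions = fill_mask("Tinha uma [MASK] no meio do caminho.")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```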
Use Cases
Natural Language Processing
Named Entity Recognition
Identify entities such as person names, locations, and organizations in text
Achieves state-of-the-art performance on Portuguese NER benchmarks
Text Similarity Calculation
Calculate semantic similarity between two pieces of text
Performs strongly on semantic textual similarity evaluation datasets
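As a rough illustration of similarity scoring, the sketch below mean-pools the raw encoder's hidden states and compares sentences with cosine similarity. This uses the base model rather than a task-specific fine-tuned head, which would normally give stronger STS results; the checkpoint name is assumed as above.

```python
# Sentence-similarity sketch with mean-pooled BERTimbau embeddings.
# Uses the raw encoder; a fine-tuned STS model would typically perform better.
import torch
from transformers import AutoTokenizer, AutoModel

model_name = "neuralmind/bert-large-portuguese-cased"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

def embed(sentence: str) -> torch.Tensor:
    """Mean-pool the last hidden states over non-padding tokens."""
    inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, 1024)
    mask = inputs["attention_mask"].unsqueeze(-1)   # (1, seq_len, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

a = embed("O gato dorme no sofá.")
b = embed("Um gato está dormindo no sofá.")
similarity = torch.nn.functional.cosine_similarity(a, b).item()
print(f"cosine similarity: {similarity:.3f}")
```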