BERTimbau
A BERT model pre-trained for Brazilian Portuguese, excelling in various NLP tasks
Model Overview
BERTimbau Base is a pre-trained BERT model for Brazilian Portuguese. It achieves state-of-the-art results on downstream tasks such as named entity recognition, sentence textual similarity, and recognizing textual entailment.
Model Features
Optimized for Brazilian Portuguese
Pre-trained and tuned specifically for the Brazilian variant of Portuguese
Excellent Multi-task Performance
Strong performance on a range of NLP tasks, including named entity recognition, sentence similarity, and textual entailment
Two Size Options
Available in base version (110M parameters) and large version (335M parameters)
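As a minimal sketch, either size can be loaded with the Hugging Face transformers library. The checkpoint names used below (neuralmind/bert-base-portuguese-cased and neuralmind/bert-large-portuguese-cased) are the commonly published BERTimbau checkpoints and are an assumption here, not stated above:

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Assumed Hub checkpoint names: base (~110M params) and large (~335M params)
BASE = "neuralmind/bert-base-portuguese-cased"
LARGE = "neuralmind/bert-large-portuguese-cased"

# Swap BASE for LARGE to load the larger checkpoint
tokenizer = AutoTokenizer.from_pretrained(BASE)
model = AutoModelForMaskedLM.from_pretrained(BASE)

print(f"parameters: {model.num_parameters():,}")  # roughly 110M for the base checkpoint
```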
Model Capabilities
Masked language modeling
Text embedding
Named entity recognition
Sentence similarity calculation
Textual entailment
Use Cases
Natural Language Processing
Text Completion
Predict masked words in sentences
Example: the model correctly predicts 'pedra' (stone) for the mask in 'Tinha uma [MASK] no meio do caminho.' ('There was a [MASK] in the middle of the road.'), as in the sketch below
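A minimal sketch of this use case with the transformers fill-mask pipeline; the checkpoint name is an assumption (the published base model), not stated above:

```python
from transformers import pipeline

# Fill-mask pipeline built on the assumed BERTimbau base checkpoint
fill_mask = pipeline("fill-mask", model="neuralmind/bert-base-portuguese-cased")

for prediction in fill_mask("Tinha uma [MASK] no meio do caminho."):
    # Each prediction holds the candidate token and its probability score
    print(prediction["token_str"], round(prediction["score"], 4))
```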
Semantic Analysis
Obtain semantic embeddings of text
The base model produces 768-dimensional embedding vectors (the large model uses a 1024-dimensional hidden size); see the sketch below
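A minimal sketch of extracting a 768-dimensional sentence embedding from the base model's [CLS] hidden state; the checkpoint name is an assumption, and mean pooling over token states is a common alternative to the [CLS] vector:

```python
import torch
from transformers import AutoModel, AutoTokenizer

name = "neuralmind/bert-base-portuguese-cased"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)
model.eval()

inputs = tokenizer("Tinha uma pedra no meio do caminho.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Hidden state of the [CLS] token: one 768-dimensional vector for the sentence
embedding = outputs.last_hidden_state[0, 0]
print(embedding.shape)  # torch.Size([768])
```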