TiBERT Base
Developed by fgaim
This is a BERT base model pretrained specifically for Tigrinya, trained for 40 epochs on a dataset of 40 million tokens.
Release Time: 3/2/2022
Model Overview
This is a monolingual pretrained language model designed specifically for Tigrinya, based on the BERT architecture and suitable for various natural language processing tasks.
Model Features
Large-scale Pretraining
Pretrained on a 40-million-token Tigrinya dataset
Full BERT Architecture
Adopts standard BERT architecture with 12 layers and 12 attention heads
TPU-optimized Training
Trained efficiently on a TPU v3-8; checkpoints are available in both Flax and PyTorch formats (see the loading sketch below)
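A minimal loading sketch using the Hugging Face `transformers` library, assuming the checkpoints are published on the Hub under the id `fgaim/tibert-base` (an assumption for illustration; substitute the actual repository id if it differs):

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Hub id assumed for illustration; replace with the actual repository id.
model_id = "fgaim/tibert-base"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# Load the PyTorch checkpoint.
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Load the Flax checkpoint instead (requires `pip install flax`):
# from transformers import FlaxAutoModelForMaskedLM
# flax_model = FlaxAutoModelForMaskedLM.from_pretrained(model_id)
```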
Model Capabilities
Masked language prediction
Semantic understanding
Word vector representation
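As a sketch of the word vector capability: the encoder's last-layer hidden states can serve as contextual token embeddings. This again assumes the hypothetical Hub id above, and the input sentence is only a placeholder:

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Hub id assumed for illustration; replace with the actual repository id.
model_id = "fgaim/tibert-base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Placeholder input; substitute any Tigrinya sentence.
inputs = tokenizer("ሰላም ዓለም", return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per token: shape (batch, seq_len, hidden_size).
token_vectors = outputs.last_hidden_state
print(token_vectors.shape)
```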
Use Cases
Natural Language Processing
Text Completion
Can be used for automatic text completion tasks in Tigrinya
Given a sentence containing a [MASK] token, the model predicts likely words for the masked position (see the sketch below)
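A minimal sketch of such completion with the `transformers` fill-mask pipeline, again assuming the hypothetical Hub id `fgaim/tibert-base` and using a placeholder input sentence:

```python
from transformers import pipeline

# Hub id assumed for illustration; replace with the actual repository id.
fill_mask = pipeline("fill-mask", model="fgaim/tibert-base")

# Use the tokenizer's own mask token in case it differs from "[MASK]".
mask = fill_mask.tokenizer.mask_token

# Placeholder sentence; substitute a real Tigrinya sentence.
for prediction in fill_mask(f"ሰላም {mask}"):
    print(prediction["token_str"], round(prediction["score"], 3))
```

Each prediction includes the candidate token and its score, so the top-ranked candidates can be offered as completions.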
Semantic Analysis
Can be used for semantic understanding and analysis of Tigrinya texts