TswanaBert

Developed by MoseliMotsoehli
A Tswana language model pretrained with the Masked Language Modeling (MLM) objective.
Downloads: 42
Release date: 3/2/2022

Model Overview

TswanaBert is a Transformer model pretrained in a self-supervised manner on a Tswana corpus with the masked language modeling objective: a portion of the input tokens is masked and the model is trained to predict them. Input text is processed with a byte-level tokenizer.
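For orientation, the sketch below loads the checkpoint with the Hugging Face transformers library. The Hub repository id MoseliMotsoehli/TswanaBert is an assumption inferred from the developer and model names above, not something this page confirms.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumed repository id, inferred from the developer/model names above.
MODEL_ID = "MoseliMotsoehli/TswanaBert"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)
```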

Model Features

Tswana-Specific
A BERT model specifically optimized and trained for the Tswana language.
Self-Supervised Learning
Pretrained using the Masked Language Modeling task.
Byte-Level Tokenization
Processes input text using byte-level tokenization (see the sketch after this list).
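To make the byte-level tokenization concrete, this sketch encodes a short Setswana sentence and prints the resulting tokens. The repository id is the same assumption as above, and the sample sentence is purely illustrative.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("MoseliMotsoehli/TswanaBert")  # assumed id

text = "Dumela, o tsogile jang?"  # illustrative Setswana greeting
token_ids = tokenizer.encode(text)
print(tokenizer.convert_ids_to_tokens(token_ids))
# Byte-level tokenizers typically mark word-initial spaces with a byte
# rendered as "Ġ", so tokens need not align with whole words.
```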

Model Capabilities

Masked Word Prediction
Tswana Text Understanding
Downstream Task Fine-Tuning

Use Cases

Natural Language Processing
Text Completion
Predicts masked words in Tswana text.
Examples show the model completing everyday Tswana phrases (see the fill-mask sketch below).
Language Model Fine-Tuning
Can serve as a base model for fine-tuning on downstream NLP tasks (a sketch follows below).
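The text-completion use case maps directly onto the transformers fill-mask pipeline, sketched below. Both the repository id and the `<mask>` token are assumptions (byte-level BPE checkpoints commonly use a RoBERTa-style `<mask>`); check tokenizer.mask_token on the actual model, and treat the Setswana prompt as illustrative.

```python
from transformers import pipeline

# Assumed repo id; "<mask>" assumes a RoBERTa-style mask token.
unmasker = pipeline("fill-mask", model="MoseliMotsoehli/TswanaBert")

# Illustrative prompt, roughly "I am going to <mask>."
for prediction in unmasker("Ke ya go <mask>."):
    print(prediction["token_str"], round(prediction["score"], 3))
```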
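For the fine-tuning use case, here is a minimal sketch of reusing the pretrained encoder for a downstream classifier; the three-label setup and the repository id are assumptions for illustration.

```python
from transformers import AutoModelForSequenceClassification

# Hypothetical downstream task: a 3-class Tswana text classifier.
model = AutoModelForSequenceClassification.from_pretrained(
    "MoseliMotsoehli/TswanaBert",  # assumed repo id
    num_labels=3,                  # assumed label count
)
# The classification head is freshly initialized; train it on labeled
# Tswana data, e.g. with the transformers Trainer API.
```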