Bert Base Multilingual Cased Finetuned Amharic
Developed by Davlan
An Amharic language model built from multilingual BERT by replacing the vocabulary and fine-tuning on Amharic text; it outperforms the original multilingual model on NER tasks
Downloads 196
Release date: 3/2/2022
Model Overview
This is a BERT model fine-tuned on an Amharic corpus, optimized for Amharic text-processing tasks, and it performs strongly on Named Entity Recognition
Model Features
Amharic-specific vocabulary
Replaces mBERT's original vocabulary, which did not cover Amharic, with an Amharic-specific one, significantly improving understanding of Amharic text
NER task performance advantage
Achieved an F1 score of 60.89 on the MasakhaNER dataset, significantly outperforming the original multilingual BERT model
Efficient fine-tuning
Fine-tunes the pre-trained multilingual BERT model rather than training from scratch, saving training resources
Model Capabilities
Amharic text understanding
Masked word prediction
Named Entity Recognition
Use Cases
Natural Language Processing
Amharic text analysis
Processing Amharic news articles and other text content
Achieved an F1 score of 60.89 in NER tasks
Language model applications
Used for tasks such as masked word prediction in Amharic
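Masked-word prediction with this model can be sketched via the Hugging Face `fill-mask` pipeline; this assumes the `transformers` and `torch` packages are installed and the model weights can be downloaded from the Hub, and the Amharic sentence below is purely illustrative.

```python
# Minimal sketch: masked-word prediction with the fine-tuned Amharic model.
# Assumes `transformers` (with a torch backend) is installed.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="Davlan/bert-base-multilingual-cased-finetuned-amharic",
)

# Any Amharic text containing exactly one [MASK] token works here;
# the pipeline returns the top candidate tokens with their scores.
predictions = fill_mask("የአዲስ አበባ ከተማ [MASK] ነው።")
for p in predictions:
    print(p["token_str"], round(p["score"], 4))
```

Each prediction is a dict with the filled sentence, the candidate token, and its probability score, so the results can be filtered or ranked downstream.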