🚀 xlm-roberta-base-finetuned-hausa
The xlm-roberta-base-finetuned-hausa is a Hausa RoBERTa model that offers better performance on text classification and named-entity recognition than the base XLM-RoBERTa.
🚀 Quick Start
This model can be used with the Transformers fill-mask pipeline for masked token prediction. Follow the code example below to get started.
✨ Features
- Enhanced Performance: It provides better performance than XLM-RoBERTa on text classification and named entity recognition datasets (a downstream fine-tuning sketch follows this list).
- Fine-tuned on Hausa: This model is an xlm-roberta-base model fine-tuned on a Hausa corpus.
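Because the checkpoint is a standard XLM-RoBERTa encoder, it can also serve as a backbone for downstream tasks such as named entity recognition. The snippet below is a minimal sketch, not the released NER model: the label set is illustrative, and the classification head is untrained until you fine-tune it on an annotated dataset such as MasakhaNER.

```python
# Minimal sketch: reuse this checkpoint as a backbone for Hausa NER.
# The label set below is hypothetical; take the real one from your dataset.
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_name = "Davlan/xlm-roberta-base-finetuned-hausa"
labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(
    model_name,
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)
# The token-classification head is randomly initialized; train it on an
# entity-annotated corpus before relying on its predictions.
```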
💻 Usage Examples
Basic Usage
You can use this model with the Transformers fill-mask pipeline for masked token prediction.
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='Davlan/xlm-roberta-base-finetuned-hausa')
>>> unmasker("Shugaban <mask> Muhammadu Buhari ya amince da shawarar da ma’aikatar sufuri karkashin jagoranci")
[{'sequence': '<s> Shugaban kasa Muhammadu Buhari ya amince da shawarar da ma’aikatar sufuri karkashin jagoranci</s>',
  'score': 0.8104371428489685,
  'token': 29762,
  'token_str': '▁kasa'},
 {'sequence': '<s> Shugaban Najeriya Muhammadu Buhari ya amince da shawarar da ma’aikatar sufuri karkashin jagoranci</s>',
  'score': 0.17371904850006104,
  'token': 49173,
  'token_str': '▁Najeriya'},
 {'sequence': '<s> Shugaban kasar Muhammadu Buhari ya amince da shawarar da ma’aikatar sufuri karkashin jagoranci</s>',
  'score': 0.006917025428265333,
  'token': 21221,
  'token_str': '▁kasar'},
 {'sequence': '<s> Shugaban Nigeria Muhammadu Buhari ya amince da shawarar da ma’aikatar sufuri karkashin jagoranci</s>',
  'score': 0.005785710643976927,
  'token': 72620,
  'token_str': '▁Nigeria'},
 {'sequence': '<s> Shugaban Kasar Muhammadu Buhari ya amince da shawarar da ma’aikatar sufuri karkashin jagoranci</s>',
  'score': 0.0010596115607768297,
  'token': 170255,
  'token_str': '▁Kasar'}]
```
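If you prefer to call the model directly rather than through the pipeline helper, the following is a minimal sketch using the standard masked-LM classes; it recovers the top prediction for the same sentence.

```python
# Minimal sketch: masked-token prediction without the pipeline helper.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_name = "Davlan/xlm-roberta-base-finetuned-hausa"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

text = ("Shugaban <mask> Muhammadu Buhari ya amince da shawarar "
        "da ma’aikatar sufuri karkashin jagoranci")
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the <mask> position and take its most likely replacement token.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(top_id))  # expected: "kasa", as in the pipeline output above
```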
📚 Documentation
Intended uses & limitations
Limitations and bias
This model is limited by its training dataset of entity-annotated news articles from a specific span of time, so it may not generalize well to use cases in other domains.
Training data
This model was fine-tuned on [Hausa CC-100](http://data.statmt.org/cc-100/).
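The corpus can be inspected with the `datasets` library. The sketch below assumes the community `cc100` loading script on the Hugging Face Hub and its `lang` parameter; recent `datasets` versions may additionally require `trust_remote_code=True`.

```python
# Sketch: peek at the Hausa portion of CC-100 via the Hugging Face Hub.
# Assumes the "cc100" loading script with a `lang` parameter.
from datasets import load_dataset

hausa = load_dataset("cc100", lang="ha", split="train")
print(hausa[0]["text"])
```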
Training procedure
This model was trained on a single NVIDIA V100 GPU.
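Adaptation of this kind is typically done as continued masked-language-model training of xlm-roberta-base on the target-language text. The sketch below is illustrative only, using the standard `Trainer` API; the hyperparameters shown are not the values used for the released checkpoint.

```python
# Illustrative sketch of masked-LM fine-tuning of xlm-roberta-base on Hausa text.
# Hyperparameters are placeholders, not the ones used for this checkpoint.
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")

# Stand-in corpus; in practice this would be the Hausa CC-100 text.
corpus = Dataset.from_dict(
    {"text": ["Shugaban kasa Muhammadu Buhari ya amince da shawarar."]}
)
tokenized = corpus.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)
args = TrainingArguments(
    output_dir="xlm-roberta-base-finetuned-hausa",
    per_device_train_batch_size=8,
    num_train_epochs=3,
    fp16=True,  # mixed precision fits comfortably on a single V100
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
).train()
```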
Eval results on Test set (F-score, average over 5 runs)

| Dataset | XLM-R F1 | ha_roberta F1 |
|---------|----------|---------------|
| [MasakhaNER](https://github.com/masakhane-io/masakhane-ner) | 86.10 | 91.47 |
| VOA Hausa Textclass | | |
BibTeX entry and citation info
By David Adelani