
AfroLM Active Learning

Developed by bonadossou
AfroLM is a language model pretrained on 23 African languages, using an active learning framework to reach high performance from minimal data.
Downloads: 132
Release Time: 10/28/2022

Model Overview

This model is pretrained efficiently on 23 African languages through an innovative active learning framework. It performs strongly on tasks such as named entity recognition, text classification, and sentiment analysis while requiring less than 14% of competitors' pretraining data to achieve comparable performance.
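As a quick orientation, the sketch below shows how such a pretrained masked language model can be loaded and queried with Hugging Face Transformers. The repository ID "bonadossou/afrolm_active_learning" is assumed from the author handle above and may differ; check the model hub for the exact name.

```python
# Minimal usage sketch, assuming the model is published on the Hugging Face Hub
# under the (assumed) repo ID below and loads via the Auto* classes.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

MODEL_ID = "bonadossou/afrolm_active_learning"  # assumed repo ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)
model.eval()

# Score candidate fills for a masked token in a sample sentence.
text = f"Lagos is a city in {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the mask position and print the five most likely fill tokens.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_index].topk(5).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```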

Model Features

Efficient Data Utilization
Achieves comparable performance using less than 14% of competitors' pretraining data
Multilingual Support
Covers 23 African languages, including many low-resource languages
Active Learning Framework
Employs an innovative active learning approach to optimize training (a generic sketch of such a selection loop follows this list)
Lightweight & Efficient
Features fewer parameters and higher efficiency compared to similar models
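To make the active learning idea concrete, here is a hypothetical, generic sketch of one uncertainty-based selection round. It is an illustration of the general technique, not AfroLM's exact procedure; `score_uncertainty` and `train_one_round` are placeholder callables the caller would supply.

```python
# Generic active learning round: rank an unlabeled pool by model uncertainty,
# train on the most informative subset, and return the rest for later rounds.
# This is an illustrative sketch only, not the AfroLM training code.
from typing import Callable, List, Tuple


def active_learning_round(
    model: object,
    unlabeled_pool: List[str],
    score_uncertainty: Callable[[object, str], float],  # e.g. per-sentence MLM loss
    train_one_round: Callable[[object, List[str]], object],
    budget: int = 1000,
) -> Tuple[object, List[str], List[str]]:
    """Select the `budget` most uncertain sentences and train on them."""
    # Rank the pool from most to least uncertain under the current model.
    ranked = sorted(
        unlabeled_pool,
        key=lambda sentence: score_uncertainty(model, sentence),
        reverse=True,
    )
    selected, remaining = ranked[:budget], ranked[budget:]
    # Update the model on the selected subset and hand back the leftover pool.
    model = train_one_round(model, selected)
    return model, selected, remaining
```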

Model Capabilities

Named Entity Recognition
Text Classification
Sentiment Analysis
Cross-lingual Transfer Learning

Use Cases

Natural Language Processing
African Language Named Entity Recognition
Entity recognition on the MasakhaNER datasets (see the fine-tuning sketch after this list)
Average F1 scores: 80.13 (MasakhaNER1.0) and 83.26 (MasakhaNER2.0)
Text Classification
Yoruba and Hausa text classification
Achieved accuracy rates of 82.90% and 91.00% respectively
Sentiment Analysis
Yoruba social media (YOSM) sentiment analysis
85.40% accuracy
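For downstream use cases like the NER results above, the pretrained encoder needs a task-specific head and fine-tuning. The sketch below shows one plausible setup for MasakhaNER-style token classification; the repo ID is the same assumption as before, the label set follows the standard MasakhaNER tag scheme, and the actual fine-tuning loop (tokenization alignment, Trainer configuration) is omitted.

```python
# Hedged sketch: attach a token-classification head to the pretrained encoder
# for MasakhaNER-style NER. The repo ID is assumed; the head is randomly
# initialized and must be fine-tuned before use.
from transformers import AutoTokenizer, AutoModelForTokenClassification

MODEL_ID = "bonadossou/afrolm_active_learning"  # assumed repo ID
LABELS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC", "B-DATE", "I-DATE"]

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForTokenClassification.from_pretrained(
    MODEL_ID,
    num_labels=len(LABELS),
    id2label=dict(enumerate(LABELS)),
    label2id={label: i for i, label in enumerate(LABELS)},
)
# Fine-tune on MasakhaNER (or another labeled corpus) before running inference.
```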