
RoBERTa Medium Amharic

Developed by rasyosef
A RoBERTa model designed specifically for Amharic, pre-trained from scratch to address the shortage of well-performing models for Amharic NLP tasks.
Downloads 132
Release Time: 1/6/2025

Model Overview

A medium-sized pre-trained model based on the RoBERTa architecture, optimized for Amharic and supporting NLP tasks such as sentiment classification and named entity recognition.

Model Features

Efficient performance
With only 42 million parameters, it outperforms multilingual models seven times its size on Amharic tasks.
Dedicated training
Pre-trained from scratch on 290 million Amharic tokens, with a tokenizer built specifically for Amharic.
Fast training
Pre-training completes in only 15 hours on a single A100 40GB GPU.

Model Capabilities

Amharic text understanding
Sentiment analysis
Named entity recognition
Masked language modeling
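Since the model is a pre-trained masked language model, it can be queried directly with the Hugging Face `fill-mask` pipeline. A minimal sketch follows; the checkpoint id `rasyosef/roberta-medium-amharic` is inferred from the developer and model name above, and the `<mask>` token follows the standard RoBERTa tokenizer convention.

```python
# Minimal sketch: masked-token prediction with the Hugging Face
# fill-mask pipeline. The checkpoint id below is an assumption
# based on the developer (rasyosef) and the model name.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="rasyosef/roberta-medium-amharic")

# Amharic for "Addis Ababa is the capital city of <mask>."
# "<mask>" is the RoBERTa-style mask token assumed for this tokenizer.
masked_text = "አዲስ አበባ የ<mask> ዋና ከተማ ናት።"

predictions = fill_mask(masked_text, top_k=5)
for p in predictions:
    # Each prediction carries the filled token and its probability.
    print(p["token_str"], round(p["score"], 3))
```

The same pipeline call works for any Amharic sentence containing a single mask token; `top_k` controls how many candidate fills are returned.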

Use Cases

Sentiment analysis
Sentiment classification of Amharic reviews
Classify Amharic user reviews as positive or negative.
F1 score of 0.84 (macro-average)
Information extraction
Amharic named entity recognition
Identify entities such as person and place names in Amharic text.
F1 score of 0.75 (macro-average)
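For downstream tasks such as the sentiment classification above, the pre-trained checkpoint is loaded with a task head and then fine-tuned on labeled data. The sketch below shows the loading step only, assuming the checkpoint id `rasyosef/roberta-medium-amharic`; the classification head is randomly initialized and produces meaningful labels only after fine-tuning.

```python
# Sketch: attaching a 2-class sentiment head to the pre-trained
# encoder. The checkpoint id is an assumption; the new head is
# untrained, so the logits below are not yet meaningful predictions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "rasyosef/roberta-medium-amharic"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(
    model_id, num_labels=2  # e.g. negative / positive
)

# Amharic for "It is a good product."
inputs = tokenizer("ጥሩ ምርት ነው።", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch_size, num_labels)
print(logits.shape)
```

From here, fine-tuning with the standard `Trainer` API (or a plain PyTorch loop) on labeled Amharic reviews yields a usable classifier; the same pattern with `AutoModelForTokenClassification` applies to the named entity recognition use case.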