
XLM-RoBERTa Base Finetuned Amharic

Developed by Davlan
An XLM-RoBERTa model fine-tuned on Amharic text that outperforms the original XLM-RoBERTa on Named Entity Recognition tasks.
Downloads 81
Release Time: 3/2/2022

Model Overview

This model is an XLM-RoBERTa variant adapted to Amharic, intended for natural language processing tasks in that language, with particular strength in Named Entity Recognition.

Model Features

Amharic optimization
Specifically fine-tuned on Amharic text, outperforming the generic XLM-RoBERTa model on tasks in this language
Performance improvement
Achieved a 7-point higher F1 score than the original XLM-RoBERTa on the MasakhaNER dataset
Transfer learning
Fine-tuned from the multilingual XLM-RoBERTa-base model

Model Capabilities

Amharic text understanding
Named Entity Recognition
Masked language prediction
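The masked-language capability can be tried directly with the Hugging Face fill-mask pipeline. A minimal sketch, assuming the checkpoint is published on the Hub as Davlan/xlm-roberta-base-finetuned-amharic (inferred from the "Developed by Davlan" attribution above) and using an illustrative Amharic sentence ("Addis Ababa is the <mask> of Ethiopia"):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint into a fill-mask pipeline.
# Model ID is assumed from the Davlan attribution above.
unmasker = pipeline("fill-mask", model="Davlan/xlm-roberta-base-finetuned-amharic")

# XLM-RoBERTa tokenizers use "<mask>" as the mask token.
# Illustrative sentence: "Addis Ababa is the <mask> of Ethiopia."
predictions = unmasker("አዲስ አበባ የኢትዮጵያ <mask> ናት።")

# Print each candidate token with its probability.
for p in predictions:
    print(f"{p['token_str']}\t{p['score']:.4f}")
```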

Use Cases

Natural Language Processing
Amharic news entity recognition
Identifies named entities in Amharic news text
Achieved 77.97 F1 score on the MasakhaNER dataset
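The NER results above come from using this checkpoint as the starting point for token-classification fine-tuning on MasakhaNER. A minimal sketch of that setup, assuming the transformers library and the standard MasakhaNER entity types (PER, ORG, LOC, DATE); the label list and training step are illustrative, not the authors' exact recipe:

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Assumed MasakhaNER-style BIO label set (PER, ORG, LOC, DATE).
labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG",
          "B-LOC", "I-LOC", "B-DATE", "I-DATE"]

tokenizer = AutoTokenizer.from_pretrained("Davlan/xlm-roberta-base-finetuned-amharic")
model = AutoModelForTokenClassification.from_pretrained(
    "Davlan/xlm-roberta-base-finetuned-amharic",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)

# The token-classification head is randomly initialized here; it still needs to be
# trained on MasakhaNER (or another Amharic NER corpus) before inference.
```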