
XLMR-MaltBERTa

Developed by MaCoCu
A Maltese language model built by further pre-training the XLM-RoBERTa-large foundation model on large-scale Maltese text
Release Time: 8/11/2022

Model Overview

XLMR-MaltBERTa is a language model optimized specifically for Maltese, suitable for downstream natural language processing tasks such as part-of-speech tagging and commonsense reasoning.
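Since it is a masked language model, the most direct way to try it is masked-token prediction through the Hugging Face transformers library. A minimal sketch, assuming the model is published on the Hub under the repo id MaCoCu/XLMR-MaltBERTa (the id is not given on this page) and that it uses the standard XLM-R <mask> token:

```python
from transformers import pipeline

# Repo id is an assumption; check the actual Hub page before relying on it.
fill_mask = pipeline("fill-mask", model="MaCoCu/XLMR-MaltBERTa")

# XLM-R-style models mark the blank with the <mask> token.
# "Il-belt kapitali ta' Malta hija <mask>." ~ "The capital of Malta is <mask>."
for pred in fill_mask("Il-belt kapitali ta' Malta hija <mask>."):
    print(f"{pred['token_str']!r}  score={pred['score']:.3f}")
```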

Model Features

Maltese Optimization
Pre-trained at scale specifically on Maltese text, giving it a stronger grasp of the language than a general multilingual model
Based on XLM-RoBERTa-large
Initialized from the XLM-RoBERTa-large model and further pre-trained, inheriting its strong multilingual representations (a sketch of this continued pre-training follows the list)
Large-scale Training Data
Trained on 3.2 GB of Maltese text (439 million tokens)
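The procedure described above is continued pre-training: XLM-RoBERTa-large is loaded and its masked-language-modelling objective is resumed on the Maltese corpus. A rough sketch with the transformers Trainer, where maltese_corpus.txt and all hyperparameters are placeholders, since the authors' actual training configuration is not given on this page:

```python
from datasets import load_dataset
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# Start from the multilingual foundation model named above.
tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-large")
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-large")

# maltese_corpus.txt stands in for the 3.2 GB Maltese corpus.
corpus = load_dataset("text", data_files={"train": "maltese_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

train_set = corpus["train"].map(tokenize, batched=True,
                                remove_columns=["text"])

# Standard MLM objective: 15% of tokens are masked and must be recovered.
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="xlmr-maltberta",
                           per_device_train_batch_size=8,
                           num_train_epochs=1),
    train_dataset=train_set,
    data_collator=collator,
)
trainer.train()
```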

Model Capabilities

Text understanding
Part-of-speech tagging
Commonsense reasoning (COPA-style)

Use Cases

Natural Language Processing
Part-of-speech Tagging
Evaluated on part-of-speech tagging over the UPOS/XPOS annotations of the Universal Dependencies project
Reaches 98.1% (UPOS) and 98.2% (XPOS) accuracy on the test set
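These scores come from a fine-tuned checkpoint, so reproducing them means adding a token-classification head and training it on the Maltese UD treebank. A hedged sketch, assuming the Hub repo id MaCoCu/XLMR-MaltBERTa and the mt_mudt configuration of the universal_dependencies dataset (neither identifier appears on this page), with placeholder hyperparameters:

```python
from datasets import load_dataset
from transformers import (AutoModelForTokenClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Dataset config and repo id are assumptions, not stated on this page.
ud = load_dataset("universal_dependencies", "mt_mudt")
upos_names = ud["train"].features["upos"].feature.names

tokenizer = AutoTokenizer.from_pretrained("MaCoCu/XLMR-MaltBERTa")
model = AutoModelForTokenClassification.from_pretrained(
    "MaCoCu/XLMR-MaltBERTa", num_labels=len(upos_names))

def encode(batch):
    enc = tokenizer(batch["tokens"], is_split_into_words=True, truncation=True)
    # Give every sub-word piece its word's UPOS tag; special tokens get -100
    # so the loss ignores them (a common simplification).
    enc["labels"] = [
        [tags[w] if w is not None else -100 for w in enc.word_ids(i)]
        for i, tags in enumerate(batch["upos"])
    ]
    return enc

tokenized = ud.map(encode, batched=True,
                   remove_columns=ud["train"].column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="maltberta-upos",
                           per_device_train_batch_size=16,
                           num_train_epochs=3),
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
)
trainer.train()
```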
Commonsense Reasoning
Evaluating causal commonsense reasoning on a Google-translated version of the COPA dataset
Reaches 54.4% accuracy on the test set, only slightly above the 50% chance level for this binary task
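COPA items pair a premise with two candidate alternatives, and the model scores each (premise, alternative) pair. A minimal scoring sketch with a multiple-choice head, again assuming the repo id MaCoCu/XLMR-MaltBERTa; the translated Maltese data is not linked here, so the strings below are placeholders, and the newly initialized head would first need fine-tuning on COPA training data:

```python
import torch
from transformers import AutoModelForMultipleChoice, AutoTokenizer

# Repo id is an assumption; the multiple-choice head is freshly initialized
# on top of the pre-trained encoder, so fine-tune on COPA data before use.
tokenizer = AutoTokenizer.from_pretrained("MaCoCu/XLMR-MaltBERTa")
model = AutoModelForMultipleChoice.from_pretrained("MaCoCu/XLMR-MaltBERTa")

premise = "<Maltese premise sentence>"            # placeholder text
choices = ["<alternative 1>", "<alternative 2>"]  # placeholder text

# Encode both (premise, alternative) pairs, then reshape to
# (batch=1, n_choices=2, seq_len) as the multiple-choice head expects.
enc = tokenizer([premise, premise], choices, padding=True,
                return_tensors="pt")
inputs = {k: v.unsqueeze(0) for k, v in enc.items()}

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 2)
print("more plausible alternative:", logits.argmax(-1).item())

# Real COPA also marks whether the question asks for a cause or an effect;
# that conditioning is omitted in this sketch.
```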