mBERTu
A multilingual model for Maltese, further pre-trained on Korpus Malti v4.0 from multilingual BERT initial checkpoints
Downloads 302
Release date: April 14, 2022
Model Overview
mBERTu is a multilingual BERT model specifically adapted to Maltese, supporting natural language processing tasks such as dependency parsing, part-of-speech tagging, named entity recognition, and sentiment analysis.
Model Features
Maltese Optimization
Further pre-trained and optimized for Maltese, a low-resource language
Multitask Support
Supports various natural language processing tasks including syntactic parsing, POS tagging, named entity recognition, and sentiment analysis
High Performance
Strong results across Maltese NLP tasks, e.g. 98.66% accuracy on POS tagging
Model Capabilities
Dependency parsing
POS tagging
Named entity recognition
Sentiment analysis
Text understanding
Use Cases
Natural Language Processing
Maltese Text Analysis
Used for analyzing and processing Maltese text data
Achieved a 92.10% unlabeled attachment score (UAS) on the MUDT (Maltese Universal Dependencies Treebank) dataset
Maltese Sentiment Analysis
Analyzes sentiment tendencies in Maltese text
Reached a macro-averaged F1 score of 76.79
Maltese Educational Applications
Can be used as a tool for Maltese language learning and teaching assistance
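The two evaluation figures quoted above use standard metrics: unlabeled attachment score (the fraction of tokens whose predicted dependency head matches the gold head) and macro-averaged F1 (the unweighted mean of per-class F1 scores). A minimal sketch of how both are computed, on hypothetical toy data rather than actual model output:

```python
def uas(gold_heads, pred_heads):
    """Unlabeled attachment score: fraction of tokens whose
    predicted head index matches the gold head index."""
    correct = sum(g == p for g, p in zip(gold_heads, pred_heads))
    return correct / len(gold_heads)

def macro_f1(gold, pred, labels):
    """Macro-averaged F1: unweighted mean of per-class F1 scores."""
    f1s = []
    for label in labels:
        tp = sum(g == label and p == label for g, p in zip(gold, pred))
        fp = sum(g != label and p == label for g, p in zip(gold, pred))
        fn = sum(g == label and p != label for g, p in zip(gold, pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
    return sum(f1s) / len(f1s)

# Hypothetical 5-token sentence: head index per token (0 = root).
gold_heads = [2, 0, 2, 5, 3]
pred_heads = [2, 0, 2, 5, 2]
print(f"UAS: {uas(gold_heads, pred_heads):.2%}")  # 4 of 5 heads correct -> 80.00%

# Hypothetical sentiment labels for four texts.
gold_labels = ["pos", "neg", "pos", "neg"]
pred_labels = ["pos", "neg", "neg", "neg"]
print(f"Macro-F1: {macro_f1(gold_labels, pred_labels, ['pos', 'neg']):.2f}")
```

The macro average weights every class equally regardless of frequency, which is why it is the usual choice for sentiment benchmarks with imbalanced label distributions.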