BERTu
BERTu is a monolingual Maltese language model based on the BERT architecture, designed for the low-resource setting and supporting a range of natural language processing tasks.
Downloads: 4,486
Released: April 14, 2022
Model Overview
BERTu is a monolingual Maltese model pre-trained from scratch on Korpus Malti v4.0, based on the BERT (base) architecture. It supports multiple natural language processing tasks including dependency parsing, part-of-speech tagging, named entity recognition, and sentiment analysis.
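Since BERTu is a BERT-style masked language model, it can be loaded directly with the Hugging Face `transformers` library. The sketch below assumes the model is published on the Hub under the ID `MLRS/BERTu`; adjust the ID if your copy lives elsewhere.

```python
# Minimal sketch: masked-token prediction with BERTu via transformers.
# The model ID "MLRS/BERTu" is an assumption based on the public Hub listing.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="MLRS/BERTu")

# "Malta is a [MASK] island." in Maltese; use the tokenizer's own mask token.
masked = f"Malta hija gżira {fill_mask.tokenizer.mask_token}."
predictions = fill_mask(masked)
for p in predictions[:3]:
    print(f"{p['token_str']!r}  score={p['score']:.3f}")
```

For the downstream tasks listed below (parsing, tagging, NER, sentiment), the pre-trained checkpoint would be fine-tuned with a task-specific head rather than used through `fill-mask`.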
Model Features
Low-resource Language Optimization
Designed for low-resource languages like Maltese, BERTu draws its performance from pre-training from scratch on a high-quality curated corpus (Korpus Malti v4.0) rather than from adapting a multilingual model.
Multi-task Support
Supports various natural language processing tasks, including dependency parsing, part-of-speech tagging, named entity recognition, and sentiment analysis.
High Performance
Excels in multiple tasks, achieving a UAS of 92.31 in dependency parsing and an accuracy of over 98.5% in part-of-speech tagging.
Model Capabilities
Dependency parsing
Part-of-speech tagging
Named entity recognition
Sentiment analysis
Use Cases
Natural Language Processing
Dependency Parsing
Analyzes the syntactic structure of Maltese sentences, identifying dependency relationships between words.
Unlabeled Attachment Score (UAS) 92.31, Labeled Attachment Score (LAS) 88.14.
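To make the two parsing metrics concrete: UAS is the percentage of tokens whose predicted head is correct, while LAS additionally requires the dependency label to match. A minimal sketch, using invented toy trees rather than BERTu outputs:

```python
# Illustrative computation of attachment scores (not the authors' code).
# Each token is represented as (head_index, dependency_label).

def attachment_scores(gold, pred):
    """Return (UAS, LAS) as percentages over aligned token lists."""
    assert len(gold) == len(pred)
    uas_hits = sum(g[0] == p[0] for g, p in zip(gold, pred))   # head only
    las_hits = sum(g == p for g, p in zip(gold, pred))         # head + label
    n = len(gold)
    return 100 * uas_hits / n, 100 * las_hits / n

gold = [(2, "det"), (0, "root"), (2, "amod"), (2, "punct")]
pred = [(2, "det"), (0, "root"), (2, "nmod"), (2, "punct")]
uas, las = attachment_scores(gold, pred)  # all heads right, one label wrong
```

Here every head is attached correctly but one label differs, so UAS is 100.0 while LAS drops to 75.0, mirroring why LAS is always at most UAS.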
Part-of-Speech Tagging
Tags each word in Maltese text with its part of speech.
Universal POS tagging accuracy (UPOS) 98.58, language-specific POS tagging accuracy (XPOS) 98.54.
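Both UPOS and XPOS numbers are plain token-level accuracies: the share of tokens whose predicted tag equals the gold tag, computed once over the universal tagset and once over the language-specific tagset. A toy sketch (tags invented, not from the BERTu evaluation):

```python
# Token-level tagging accuracy, the metric behind the UPOS/XPOS figures.

def tag_accuracy(gold_tags, pred_tags):
    """Percentage of positions where predicted tag matches gold tag."""
    assert len(gold_tags) == len(pred_tags)
    hits = sum(g == p for g, p in zip(gold_tags, pred_tags))
    return 100 * hits / len(gold_tags)

gold = ["DET", "NOUN", "VERB", "ADJ", "PUNCT"]
pred = ["DET", "NOUN", "VERB", "NOUN", "PUNCT"]
acc = tag_accuracy(gold, pred)  # 4 of 5 tags correct
```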
Named Entity Recognition
Identifies named entities (e.g., person names, locations, organizations) in Maltese text.
Span-based F1 score 86.77.
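Span-based F1 is stricter than token-level scoring: a predicted entity counts only if both its span boundaries and its type exactly match a gold entity. A minimal sketch with invented spans, not BERTu outputs:

```python
# Span-based F1 for NER: exact-match on (start, end, entity_type).

def span_f1(gold_spans, pred_spans):
    """gold_spans/pred_spans: sets of (start, end, type) tuples."""
    tp = len(gold_spans & pred_spans)                 # exact matches only
    precision = tp / len(pred_spans) if pred_spans else 0.0
    recall = tp / len(gold_spans) if gold_spans else 0.0
    if precision + recall == 0:
        return 0.0
    return 100 * 2 * precision * recall / (precision + recall)

gold = {(0, 2, "PER"), (5, 6, "LOC"), (8, 10, "ORG")}
pred = {(0, 2, "PER"), (5, 6, "ORG"), (8, 10, "ORG")}
f1 = span_f1(gold, pred)  # one span has the wrong type, so it scores as an error
```

Note the middle prediction gets no credit despite perfect boundaries, because its type (`ORG` vs. `LOC`) is wrong.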
Sentiment Analysis
Analyzes the sentiment polarity (e.g., positive, negative, neutral) of Maltese text.
Macro-average F1 score 78.96.
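Macro-averaged F1 computes a one-vs-rest F1 per sentiment class and averages them with equal weight, so rare classes count as much as frequent ones. A toy sketch (labels invented, not from the BERTu evaluation):

```python
# Macro-averaged F1 over three sentiment classes.

def macro_f1(gold, pred, labels=("positive", "negative", "neutral")):
    scores = []
    for lab in labels:
        tp = sum(g == lab and p == lab for g, p in zip(gold, pred))
        fp = sum(g != lab and p == lab for g, p in zip(gold, pred))
        fn = sum(g == lab and p != lab for g, p in zip(gold, pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        scores.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return 100 * sum(scores) / len(scores)   # equal weight per class

gold = ["positive", "negative", "neutral", "positive"]
pred = ["positive", "negative", "positive", "positive"]
m = macro_f1(gold, pred)  # the missed "neutral" class drags the macro average down
```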