CroSloEngual BERT
A trilingual model based on the bert-base architecture, specialized for Croatian, Slovenian, and English, outperforming multilingual BERT on these languages
Tags: Large Language Model, Multilingual, Trilingual BERT, Cross-lingual transfer, South Slavic language optimization
Downloads 510
Release Time: 3/2/2022
Model Overview
CroSloEngual BERT is a trilingual BERT model optimized for Croatian, Slovenian, and English, providing better performance than general multilingual BERT while retaining cross-lingual knowledge transfer capabilities.
Model Features
Trilingual Optimization
Specifically optimized for Croatian, Slovenian, and English, outperforming general multilingual BERT models
Cross-lingual Knowledge Transfer
Retains cross-lingual knowledge transfer capabilities lacking in monolingual models
Academic Validation
Model performance is validated in an academic paper, with results published at the TSD 2020 conference
Model Capabilities
Multilingual text understanding
Cross-lingual knowledge transfer
Natural language processing tasks
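The capabilities above are typically accessed by loading the model through Hugging Face transformers and pooling its token states into sentence vectors. A minimal sketch of mask-aware mean pooling follows; the transformers calls are shown as comments so the snippet runs without downloading weights, and the model id "EMBEDDIA/crosloengual-bert" is assumed to be the published Hugging Face id.

```python
import numpy as np

def mean_pool(hidden_states, attention_mask):
    """Average token vectors per sentence, ignoring padded positions."""
    mask = attention_mask[:, :, None].astype(hidden_states.dtype)
    summed = (hidden_states * mask).sum(axis=1)
    counts = mask.sum(axis=1)
    return summed / counts

# With the real model (requires `pip install transformers torch`;
# the id "EMBEDDIA/crosloengual-bert" is assumed here):
# from transformers import AutoTokenizer, AutoModel
# tok = AutoTokenizer.from_pretrained("EMBEDDIA/crosloengual-bert")
# model = AutoModel.from_pretrained("EMBEDDIA/crosloengual-bert")
# batch = tok(["Dober dan!", "Dobar dan!"], padding=True, return_tensors="pt")
# out = model(**batch)
# vecs = mean_pool(out.last_hidden_state.detach().numpy(),
#                  batch["attention_mask"].numpy())

# Toy check with fake hidden states: two real tokens, one padded.
h = np.array([[[1.0, 2.0], [3.0, 4.0], [9.0, 9.0]]])
m = np.array([[1, 1, 0]])
vec = mean_pool(h, m)  # -> [[2.0, 3.0]]
```

The padding-aware average matters because batches mix sentences of different lengths; without the mask, padding tokens would dilute the sentence vector.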
Use Cases
Multilingual Processing
Croatian-Slovenian-English Text Analysis
Semantic analysis and understanding of texts in three languages
Delivers better performance than general multilingual BERT
Cross-lingual Information Retrieval
Information retrieval and matching across three languages
Improves retrieval effectiveness by leveraging cross-lingual knowledge transfer
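In a cross-lingual retrieval setup like the one described above, queries and documents in different languages are encoded into the same vector space and ranked by cosine similarity. A minimal sketch of the ranking step, using toy vectors standing in for CroSloEngual BERT sentence embeddings (in practice these would come from pooling the model's hidden states):

```python
import numpy as np

def cosine_rank(query_vec, doc_vecs):
    """Return document indices sorted by cosine similarity to the query."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q
    return np.argsort(-scores), scores

# Toy embeddings standing in for model-produced sentence vectors.
query = np.array([0.9, 0.1, 0.0])          # e.g. an English query
docs = np.array([
    [0.80, 0.20, 0.10],   # Croatian document, semantically close
    [0.00, 1.00, 0.00],   # unrelated document
    [0.85, 0.05, 0.10],   # Slovenian document, semantically close
])
order, scores = cosine_rank(query, docs)   # order -> [2, 0, 1]
```

Because the trilingual model places all three languages in a shared embedding space, a query in one language can surface relevant documents written in the other two.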