
Multilingual Albert Base Cased 64k

Developed by cservan
A multilingual ALBERT model pretrained with a masked language modeling (MLM) objective, using a 64k-token vocabulary and case-sensitive tokenization
Downloads 52
Release Time: 12/20/2023

Model Overview

mALBERT is a Transformers model pretrained in a self-supervised fashion on multilingual Wikipedia text. It uses ALBERT's shared-weight Transformer layers, which keep the parameter count low, and is intended to be fine-tuned on downstream tasks.
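
As a minimal sketch of the MLM pretraining objective in use, the snippet below runs the fill-mask pipeline from Hugging Face transformers. The repository id cservan/malbert-base-cased-64k is an assumption derived from the card title; adjust it if the actual model id differs.

```python
from transformers import pipeline

# Assumed repository id (derived from the card title); adjust if needed.
MODEL_ID = "cservan/malbert-base-cased-64k"

# Fill-mask pipeline: predict the token hidden behind the mask.
unmasker = pipeline("fill-mask", model=MODEL_ID)
mask = unmasker.tokenizer.mask_token  # use the model's own mask token

for pred in unmasker(f"Paris is the {mask} of France."):
    print(pred["token_str"], round(pred["score"], 3))
```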

Model Features

Multilingual Support
Supports 50+ languages, covering the major European and Asian languages
Shared Weight Architecture
Uses ALBERT's cross-layer parameter sharing (shared-weight Transformer layers) to reduce memory usage
Case Sensitivity
Distinguishes between lowercase and uppercase word forms (e.g., 'french' vs. 'French'); see the tokenizer sketch after this list
Efficient Pretraining
Pretrained with the dual objectives of masked language modeling (MLM) and sentence order prediction (SOP)
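
A minimal sketch of the case-sensitive vocabulary, assuming the same repository id as above: the cased tokenizer keeps 'french' and 'French' as distinct token sequences.

```python
from transformers import AutoTokenizer

# Assumed repository id; adjust to the actual Hugging Face model name.
MODEL_ID = "cservan/malbert-base-cased-64k"
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# A cased vocabulary tokenizes lowercase and uppercase forms differently.
print(tokenizer.tokenize("french"))
print(tokenizer.tokenize("French"))
```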

Model Capabilities

Multilingual text understanding
Sentence order prediction
Downstream task feature extraction
Masked token prediction
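
For downstream-task feature extraction, a minimal sketch (again assuming the repository id above) is to run the bare encoder and read the last hidden states:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Assumed repository id; adjust if needed.
MODEL_ID = "cservan/malbert-base-cased-64k"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

inputs = tokenizer("Bonjour tout le monde", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One hidden-state vector per input token, usable as features for a downstream task.
print(outputs.last_hidden_state.shape)
```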

Use Cases

Natural Language Processing
Slot Filling
Slot filling / information extraction on datasets such as MMNLU and MultiATIS++
Achieved 88.88% accuracy on MultiATIS++
Text Classification
Multilingual text classification tasks; see the fine-tuning sketch after this section
Achieved 71.26% accuracy on MMNLU classification tasks
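
As a rough sketch of how these downstream tasks could be set up (the repository id and label counts below are assumptions, not values from the card), the pretrained encoder can be loaded with task-specific heads and then fine-tuned with the usual transformers training workflow:

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoModelForTokenClassification,
    AutoTokenizer,
)

# Assumed repository id; adjust to the actual model name.
MODEL_ID = "cservan/malbert-base-cased-64k"
NUM_CLASSES = 18    # hypothetical number of classification labels
NUM_SLOT_TAGS = 84  # hypothetical number of slot tags

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# Text classification head (e.g. the MMNLU-style classification task).
classifier = AutoModelForSequenceClassification.from_pretrained(
    MODEL_ID, num_labels=NUM_CLASSES
)

# Slot-filling head: one label per token (e.g. the MultiATIS++-style task).
slot_tagger = AutoModelForTokenClassification.from_pretrained(
    MODEL_ID, num_labels=NUM_SLOT_TAGS
)
```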