
Multilingual Albert Base Cased 128k

Developed by cservan
A multilingual ALBERT model pretrained with a masked language modeling (MLM) objective, supporting 60+ languages and featuring a lightweight architecture with cross-layer parameter sharing.
Downloads: 277
Release date: 12/20/2023

Model Overview

This is a case-sensitive multilingual ALBERT model, pretrained in a self-supervised manner on Wikipedia text and intended for fine-tuning on downstream tasks. The model shares weights across its Transformer layers, giving it a smaller memory footprint than comparably deep BERT-style models.
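The model can be exercised through the Hugging Face transformers fill-mask pipeline. Below is a minimal sketch; the checkpoint id cservan/multilingual-albert-base-cased-128k is an assumption inferred from this card's title and should be verified on the Hub before use.

```python
# Minimal masked-word prediction sketch with the fill-mask pipeline.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="cservan/multilingual-albert-base-cased-128k",  # assumed checkpoint id
)

# ALBERT tokenizers use "[MASK]" as the mask token.
for prediction in fill_mask("Paris is the [MASK] of France."):
    print(prediction["token_str"], round(prediction["score"], 4))
```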

Model Features

Multilingual support
Supports over 60 languages, including the major European and Asian languages
Parameter-shared architecture
Uses ALBERT's cross-layer weight sharing, in which all Transformer layers reuse one set of weights, significantly reducing the parameter count (illustrated in the sketch after this list)
Case-sensitive
Unlike the original uncased ALBERT, this model distinguishes uppercase from lowercase word forms (e.g., "french" vs. "French")
Efficient pretraining
Trained with both the masked language modeling (MLM) and sentence order prediction (SOP) objectives
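The effect of cross-layer weight sharing can be checked directly: adding layers does not add parameters, because every layer reuses the same weights. The sketch below assumes base-sized dimensions and a 128k vocabulary, both inferred from the model name rather than stated on this card.

```python
# Sketch: ALBERT shares one set of Transformer weights across all layers,
# so doubling the depth leaves the parameter count unchanged.
from transformers import AlbertConfig, AlbertModel

def count_params(num_layers: int) -> int:
    config = AlbertConfig(
        vocab_size=128_000,      # "128k" vocabulary (assumed)
        embedding_size=128,      # factorized embeddings, ALBERT default
        hidden_size=768,         # "base"-sized Transformer (assumed)
        num_attention_heads=12,
        intermediate_size=3072,
        num_hidden_layers=num_layers,
    )
    model = AlbertModel(config)
    return sum(p.numel() for p in model.parameters())

# Only the embedding table and the single shared layer contribute
# parameters, so both calls print the same number.
print(count_params(12), count_params(24))
```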

Model Capabilities

Multilingual text understanding
Sentence order prediction
Masked word prediction
Downstream task fine-tuning

Use Cases

Natural Language Processing
Slot filling
Used for information extraction in dialogue systems
Achieved 89.14 accuracy on the MultiATIS++ dataset
Text classification
Used for multilingual text classification tasks such as intent detection (a fine-tuning setup is sketched after this list)
Achieved 96.84 accuracy on the SNIPS dataset
Named entity recognition
Used to identify named entities in text
Achieved an 88.27 F1 score on the CoNLL-2003 dataset
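Preparing the model for one of these downstream tasks follows the usual transformers pattern: load the pretrained encoder with a fresh task head and fine-tune. The sketch below uses intent classification as in the SNIPS use case; the checkpoint id and the 7-label head are assumptions (SNIPS defines 7 intents), and the head is randomly initialized until trained.

```python
# Sketch of setting the model up for downstream fine-tuning
# (sequence classification). Outputs are meaningless before training.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

checkpoint = "cservan/multilingual-albert-base-cased-128k"  # assumed id
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint, num_labels=7  # assumed: SNIPS intent classes
)

inputs = tokenizer("Play some jazz music", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1))  # predicted intent id (random until fine-tuned)
```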