# Multilingual pretraining

## Multilingual Albert Base Cased 128k

- Author: cservan
- License: Apache-2.0
- Tags: Large Language Model, Transformers, Supports Multiple Languages
- Downloads: 277, Likes: 2

A multilingual ALBERT model pretrained with the masked language modeling (MLM) objective. It supports 60+ languages and features a lightweight architecture with cross-layer parameter sharing.
## Multilingual Albert Base Cased 32k

- Author: cservan
- License: Apache-2.0
- Tags: Large Language Model, Transformers, Supports Multiple Languages
- Downloads: 243, Likes: 2

A case-sensitive multilingual ALBERT model pretrained with the masked language modeling objective, supporting 50+ languages.
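As a minimal usage sketch, both ALBERT checkpoints can be queried through the transformers fill-mask pipeline. The Hub ID below is inferred from the listing and should be verified against the actual repository name.

```python
from transformers import pipeline

# Hub ID inferred from the listing above; verify before use.
fill_mask = pipeline("fill-mask", model="cservan/multilingual-albert-base-cased-128k")

# Read the mask token from the tokenizer instead of hard-coding it.
mask = fill_mask.tokenizer.mask_token
for prediction in fill_mask(f"Paris is the {mask} of France."):
    print(prediction["token_str"], prediction["score"])
```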
## Infoxlm German Question Answering

- Author: svalabs
- Tags: Question Answering, Transformers, German
- Downloads: 145, Likes: 3

A German question-answering model fine-tuned from InfoXLM-large on the GermanQuAD and SQuAD datasets.
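A minimal sketch of extractive question answering with this model via the transformers question-answering pipeline; the Hub ID is inferred from the listing.

```python
from transformers import pipeline

# Hub ID inferred from the listing; adjust if the repository is named differently.
qa = pipeline("question-answering", model="svalabs/infoxlm-german-question-answering")

result = qa(
    question="Wie heißt die Hauptstadt von Deutschland?",  # "What is the capital of Germany?"
    context="Berlin ist die Hauptstadt der Bundesrepublik Deutschland.",
)
print(result["answer"], result["score"])
```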
## Mt5 Base Dacsa Es

- Author: ELiRF
- Tags: Text Generation, Transformers, Spanish
- Downloads: 154, Likes: 2

A fine-tuned version of the mT5 base model for Spanish text summarization, particularly suited to generating summaries of news articles.
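A minimal summarization sketch using the generic seq2seq classes; the Hub ID is inferred from the listing, and the model card should be checked for any required input prefix.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hub ID inferred from the listing; check the model card for usage details.
model_id = "ELiRF/mt5-base-dacsa-es"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Placeholder article text; replace with a real Spanish news article.
article = "El Gobierno anunció hoy un nuevo paquete de medidas económicas..."
inputs = tokenizer(article, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```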
## Wav2vec2 Base 10k Voxpopuli Ft Nl

- Author: facebook
- Tags: Speech Recognition, Transformers, Other
- Downloads: 28, Likes: 0

A speech recognition model based on Facebook's Wav2Vec2 architecture, pretrained on 10K hours of unlabeled Dutch data from the VoxPopuli corpus and fine-tuned on Dutch transcription data.
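A minimal transcription sketch with the automatic-speech-recognition pipeline; the audio file name is a placeholder.

```python
from transformers import pipeline

# Hub ID taken from the listing (a published Facebook VoxPopuli checkpoint).
asr = pipeline(
    "automatic-speech-recognition",
    model="facebook/wav2vec2-base-10k-voxpopuli-ft-nl",
)

# "dutch_sample.wav" is a placeholder; pass any Dutch audio file or raw waveform.
print(asr("dutch_sample.wav")["text"])
```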
## Xlm Roberta Base English Upos

- Author: KoichiYasuoka
- Tags: Sequence Labeling, Transformers, Supports Multiple Languages
- Downloads: 21, Likes: 2

An English part-of-speech (POS) tagging and dependency parsing model based on XLM-RoBERTa, producing Universal POS (UPOS) tags.
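A minimal POS-tagging sketch via the token-classification pipeline; the Hub ID is inferred from the listing.

```python
from transformers import pipeline

# Hub ID inferred from the listing above.
tagger = pipeline(
    "token-classification",
    model="KoichiYasuoka/xlm-roberta-base-english-upos",
)

# Each token comes back with its predicted UPOS label.
for token in tagger("The quick brown fox jumps over the lazy dog."):
    print(token["word"], token["entity"])
```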
## Xlm Mlm 17 1280

- Author: FacebookAI
- Tags: Large Language Model, Transformers, Supports Multiple Languages
- Downloads: 201, Likes: 2

A cross-lingual XLM model pretrained on text in 17 languages using the masked language modeling (MLM) objective.