# Multilingual Understanding

**Qwen3-30B-A3B-Base** · Apache-2.0 · Qwen · 9,745 downloads · 33 likes
Qwen3-30B-A3B-Base is the latest 30.5B-parameter Mixture-of-Experts (MoE) large language model in the Qwen series, supporting 119 languages and a 32K context length.
Tags: Large Language Model · Transformers

**gte-Qwen2-1.5B-instruct** · Apache-2.0 · Alibaba-NLP · 242.12k downloads · 207 likes
A general-purpose text-embedding model based on Qwen2-1.5B, supporting multilingual and long-text processing.
Tags: Text Embedding · Transformers

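Embedding models like the one above map a variable-length text to a single fixed-size vector. A common recipe (a minimal sketch only; the exact pooling a given model uses, e.g. mean vs. last-token pooling, is documented on its model card) is to average the per-token vectors while skipping padding positions, then compare sentences by cosine similarity. The toy numbers below stand in for real model outputs:

```python
# Masked mean pooling: average token vectors, ignoring padding (mask == 0).
# Toy 2-dimensional vectors stand in for real transformer hidden states.

def mean_pool(token_embeddings, attention_mask):
    """Average the token vectors whose mask entry is 1."""
    dim = len(token_embeddings[0])
    total = [0.0] * dim
    count = 0
    for vec, keep in zip(token_embeddings, attention_mask):
        if keep:
            total = [t + v for t, v in zip(total, vec)]
            count += 1
    return [t / count for t in total]

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb)

# Three tokens; the last one is padding and must not affect the result.
tokens = [[1.0, 0.0], [0.0, 1.0], [9.0, 9.0]]
mask = [1, 1, 0]
sentence_vec = mean_pool(tokens, mask)
print(sentence_vec)  # [0.5, 0.5]
print(round(cosine(sentence_vec, [1.0, 1.0]), 4))  # 1.0
```

Because padding tokens are excluded, the same sentence yields the same embedding regardless of how much it is padded in a batch.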
**Yi-1.5-34B-Chat** · Apache-2.0 · 01-ai · 70.62k downloads · 270 likes
Yi-1.5 is an upgraded version of the Yi model, with stronger performance in programming, mathematics, reasoning, and instruction following, while retaining strong language understanding, commonsense reasoning, and reading comprehension.
Tags: Large Language Model · Transformers

**mMiniLMv2-L6-H384** · MIT · hotchpotch · 98 downloads · 1 like
A re-upload of Microsoft's multilingual MiniLM v2 model, packaged for convenient use with the Hugging Face transformers library.
Tags: Large Language Model · Transformers

**mMiniLMv2-L12-H384** · MIT · hotchpotch · 21 downloads · 2 likes
mMiniLMv2-L12-H384 is a version of Microsoft's multilingual MiniLM v2 model, repackaged for convenient use with the Hugging Face transformers library.
Tags: Large Language Model · Transformers

**mplug-owl-bloomz-7b-multilingual** · Apache-2.0 · MAGAer13 · 29 downloads · 9 likes
mPLUG-Owl is a multilingual vision-language model built on the BLOOMZ-7B architecture, supporting image understanding and multi-turn dialogue.
Tags: Image-to-Text · Transformers · Supports Multiple Languages

**lilt-xlm-roberta-base-finetuned-with-DocLayNet-base-at-linelevel-ml384** · MIT · pierreguillou · 700 downloads · 12 likes
A line-level document-understanding model fine-tuned from LiLT on the DocLayNet dataset, supporting multilingual document-layout analysis.
Tags: Image-to-Text · Transformers · Supports Multiple Languages

**canine-c-squad** · Splend1dchan · 68 downloads · 0 likes
CANINE-C is a character-based pre-trained Transformer model developed by Google; this checkpoint targets question-answering tasks over long text sequences.
Tags: Question Answering System · Transformers

**xlm-roberta-base** · kornesh · 30 downloads · 1 like
XLM-RoBERTa is a multilingual pre-trained model based on the RoBERTa architecture, supporting 100 languages and suited to cross-lingual understanding tasks.
Tags: Large Language Model · Transformers

**xlm-roberta-large** · kornesh · 2,154 downloads · 0 likes
XLM-RoBERTa-large is a multilingual pre-trained language model based on the RoBERTa architecture, supporting natural-language-processing tasks in many languages.
Tags: Large Language Model · Transformers

**xlmr-large-qa-fa** · m3hrdadfi · 65 downloads · 5 likes
A Persian question-answering model fine-tuned from XLM-RoBERTa-large on the PersianQA dataset, supporting Persian and multilingual QA tasks.
Tags: Question Answering System · Transformers · Other

**cino-large-v2** · Apache-2.0 · hfl · 110 downloads · 11 likes
A multilingual pre-trained model covering Chinese and seven minority languages of China.
Tags: Large Language Model · Transformers · Supports Multiple Languages

**xlmRoberta-for-VietnameseQA** · MIT · hogger32 · 54 downloads · 0 likes
A Vietnamese question-answering model fine-tuned from xlm-roberta-base on the UIT-Viquad_v2 dataset.
Tags: Question Answering System · Transformers

**roberta-large-mnli** · typeform · 119 downloads · 7 likes
RoBERTa-large fine-tuned on the MultiNLI corpus for natural language inference, commonly used as a backbone for zero-shot text classification.
Tags: Large Language Model · Transformers · Other

**xlm-roberta-base** · MIT · FacebookAI · 9.6M downloads · 664 likes
XLM-RoBERTa is a multilingual model pre-trained on 2.5 TB of filtered CommonCrawl data covering 100 languages, using masked language modeling as the training objective.
Tags: Large Language Model · Supports Multiple Languages

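The masked-language-modeling objective mentioned above trains a model by hiding some tokens and asking it to reconstruct them. The sketch below illustrates the classic BERT-style corruption recipe (15% of positions selected; of those, 80% replaced by `[MASK]`, 10% by a random token, 10% left unchanged), which the RoBERTa family also builds on. Real implementations operate on subword IDs, and the toy vocabulary here is invented for illustration:

```python
import random

def mask_tokens(tokens, rng, mask_rate=0.15):
    """Return (corrupted tokens, labels); labels are None where no prediction is required."""
    vocab = ["cat", "dog", "tree", "car", "book"]  # toy replacement vocabulary
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_rate:
            labels.append(tok)                       # model must predict the original token
            roll = rng.random()
            if roll < 0.8:
                corrupted.append("[MASK]")           # 80%: replace with the mask token
            elif roll < 0.9:
                corrupted.append(rng.choice(vocab))  # 10%: replace with a random token
            else:
                corrupted.append(tok)                # 10%: keep the token unchanged
        else:
            corrupted.append(tok)
            labels.append(None)                      # no loss at this position
    return corrupted, labels

rng = random.Random(0)
text = "the quick brown fox jumps over the lazy dog".split()
corrupted, labels = mask_tokens(text, rng)
print(corrupted)
print(labels)
```

During training, the loss is computed only at positions where a label is set, so the model learns to infer hidden tokens from their bidirectional context.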
**Zeroaraelectra** · Other · KheireddineDaouadi · 39 downloads · 0 likes
A zero-shot classification model for Arabic, supporting natural-language-inference tasks.
Tags: Text Classification · Transformers · Supports Multiple Languages

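NLI-based zero-shot classification, which models like the one above support, turns each candidate label into a hypothesis sentence and asks an NLI model how strongly the input text entails it; the best-scoring label wins. The sketch below shows only the label-selection logic: `toy_entailment_score` is a made-up word-overlap stand-in for a real NLI model's entailment probability, and the template string is an illustrative choice:

```python
# Zero-shot classification via NLI: score "text entails 'this text is about
# <label>'" for each candidate label and pick the argmax. The scorer here is
# a toy stand-in, NOT a real NLI model.

def toy_entailment_score(premise, hypothesis):
    """Stand-in scorer: fraction of hypothesis words that appear in the premise."""
    premise_words = set(premise.lower().split())
    hypothesis_words = hypothesis.lower().split()
    return sum(w in premise_words for w in hypothesis_words) / len(hypothesis_words)

def zero_shot_classify(text, labels, template="this text is about {}"):
    """Return (best label, per-label scores) using the hypothesis template."""
    scores = {lab: toy_entailment_score(text, template.format(lab)) for lab in labels}
    best = max(scores, key=scores.get)
    return best, scores

label, scores = zero_shot_classify(
    "this article is about sports and the football world cup",
    ["sports", "politics", "cooking"],
)
print(label)  # sports
```

With a real NLI backbone, the same loop runs one premise-hypothesis pair through the model per label and reads off the entailment probability, so no task-specific fine-tuning is needed.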
**BERT Responsible AI** · Mustang · 15 downloads · 0 likes
BERT is a pre-trained language model based on the Transformer architecture, capable of handling natural-language-processing tasks in multiple languages.
Tags: Large Language Model · Transformers

© 2025 AIbase