# Multilingual Twitter pre-training
Arabic XLM XNLI
MIT
Built on the XLM-RoBERTa-base model, continually pre-trained on an Arabic Twitter corpus and then fine-tuned on the Arabic portion of the XNLI dataset for zero-shot text classification.
Text Classification
Transformers
Arabic
morit
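A minimal sketch of running this model for zero-shot classification through the Hugging Face `transformers` pipeline API. The example sentence and the candidate labels are illustrative assumptions, not part of the model card; the import is deferred into the function so the snippet only needs `transformers` installed when actually classifying.

```python
# Hypothetical candidate labels (sports, politics, economy) for illustration.
CANDIDATE_LABELS = ["رياضة", "سياسة", "اقتصاد"]

def classify(text, labels=CANDIDATE_LABELS):
    """Score an Arabic sentence against candidate labels with zero-shot NLI."""
    # Deferred import: requires the `transformers` package and model download.
    from transformers import pipeline
    clf = pipeline("zero-shot-classification", model="morit/arabic_xlm_xnli")
    # Returns a dict with "labels" and "scores" sorted by descending score.
    return clf(text, candidate_labels=labels)

if __name__ == "__main__":
    # Illustrative input: "The team won the final match."
    result = classify("فاز الفريق بالمباراة النهائية")
    print(result["labels"][0], result["scores"][0])
```

Because the model was fine-tuned on XNLI entailment pairs, the candidate labels can be arbitrary Arabic phrases chosen at inference time; no task-specific retraining is needed.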
© 2025 AIbase