Bert Base Spanish Wwm Cased Xnli
Zero-shot classification model based on the Spanish BERT base model and fine-tuned on the Spanish portion of the XNLI dataset
Text Classification | Supports Multiple Languages | Open Source | License: MIT
#Zero-shot classification #Spanish NLI #Multi-label inference
Downloads: 1,957
Release Time: 3/2/2022
Model Overview
This is a Spanish BERT model for zero-shot classification, particularly suited to natural language inference tasks. It is based on the whole-word-masking Spanish BERT base model and has been fine-tuned on the Spanish portion of the XNLI dataset.
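A minimal usage sketch with the `transformers` zero-shot pipeline. The hub model ID `Recognai/bert-base-spanish-wwm-cased-xnli` and the example sentence are assumptions based on the model name and the example mentioned later in this card; verify the ID before use.

```python
from transformers import pipeline

# Assumed hub model ID for this card's model; check it on the Hugging Face Hub.
classifier = pipeline(
    "zero-shot-classification",
    model="Recognai/bert-base-spanish-wwm-cased-xnli",
)

result = classifier(
    "El autor se perfila, a los 50 años de su muerte, "
    "como uno de los grandes de su siglo",
    candidate_labels=["cultura", "sociedad", "economia", "salud", "deportes"],
    # Spanish hypothesis template: "This example is {}."
    hypothesis_template="Este ejemplo es {}.",
)
# result["labels"] is sorted by score, highest first
print(result["labels"][0], round(result["scores"][0], 3))
```

Because the pipeline sorts labels by score, `result["labels"][0]` is the predicted category for single-label classification.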
Model Features
Zero-shot classification capability
Can perform classification tasks without task-specific training data
Whole word masking pre-training
Pre-trained with the whole-word-masking technique, which improves the model's language understanding
XNLI dataset fine-tuning
Fine-tuned on the Spanish portion of the XNLI dataset to optimize natural language inference performance
Model Capabilities
Zero-shot text classification
Natural language inference
Multi-label classification
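The capabilities above all rest on the same mechanism: each candidate label is turned into a hypothesis (e.g. "Este ejemplo es <label>."), paired with the input text as the premise, and the model's entailment logit for each pair is softmaxed across labels. A minimal sketch with invented logits (a real run would get them from the fine-tuned NLI model):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def zero_shot_scores(entailment_logits, labels):
    """Rank candidate labels by their softmaxed entailment logits."""
    probs = softmax(entailment_logits)
    return sorted(zip(labels, probs), key=lambda pair: -pair[1])

labels = ["cultura", "sociedad", "economia"]
logits = [2.1, 0.3, -0.5]  # hypothetical entailment logits, one per label
ranking = zero_shot_scores(logits, labels)
print(ranking[0][0])  # the highest-scoring label wins in single-label mode
```

This is why no task-specific training data is needed: the labels only appear at inference time, inside the hypothesis text.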
Use Cases
Text classification
News classification
Classify news articles into predefined categories such as culture, society, economy, etc.
In the example, the 'culture' ('cultura') category received the highest score (0.389)
Content moderation
Identify sensitive categories to which text content belongs
Natural language understanding
Semantic similarity judgment
Determine whether two texts express the same meaning
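For the multi-label use cases above (e.g. content moderation, where a text may belong to several sensitive categories at once), the scoring changes: instead of softmaxing entailment logits across labels, each label is scored independently by softmaxing entailment against contradiction for its own premise/hypothesis pair. A sketch with invented logits and a hypothetical 0.5 acceptance threshold:

```python
import math

def label_prob(entailment_logit, contradiction_logit):
    """P(entailment) from a two-way softmax over entailment vs contradiction."""
    e = math.exp(entailment_logit)
    c = math.exp(contradiction_logit)
    return e / (e + c)

# Hypothetical (entailment, contradiction) logits per label, for illustration.
pairs = {
    "cultura": (1.8, -1.0),
    "economia": (0.9, -0.2),
    "deportes": (-1.5, 2.0),
}
threshold = 0.5
accepted = [lab for lab, (e, c) in pairs.items() if label_prob(e, c) > threshold]
print(accepted)  # every label scored independently may be accepted
```

Because each label's probability is computed independently, zero, one, or several labels can exceed the threshold for the same input.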