BERT Base Multilingual Cased Fine-tuned on SQuAD
Developed by salti
This is a question-answering model based on multilingual BERT and fine-tuned on the Stanford Question Answering Dataset (SQuAD v1.1), supporting reading comprehension tasks in multiple languages.
Downloads: 28
Release date: 3/2/2022
Model Overview
Built by fine-tuning the multilingual BERT base model, this model handles question-answering tasks in various languages, performs well on SQuAD v1.1, and exhibits cross-lingual transfer capabilities.
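A minimal usage sketch with the Hugging Face `transformers` question-answering pipeline. The Hub model ID `salti/bert-base-multilingual-cased-finetuned-squad` is inferred from the model name and author and should be verified before use.

```python
from transformers import pipeline

# Model ID inferred from the card's title and author; verify on the Hub.
qa = pipeline(
    "question-answering",
    model="salti/bert-base-multilingual-cased-finetuned-squad",
)

# Extractive QA: the answer is a span copied out of the context.
result = qa(
    question="Where is the Eiffel Tower located?",
    context="The Eiffel Tower is a wrought-iron lattice tower in Paris, France.",
)
print(result["answer"])
```

Because the underlying encoder is multilingual, the same pipeline accepts questions and contexts in any of the languages covered by the fine-tuning data.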
Model Features
Multilingual Support
Supports question-answering tasks across the 11 languages covered by the fine-tuning data, including Arabic, English, and Chinese.
Cross-lingual Transfer Ability
Demonstrates good zero-shot transfer capabilities in the XQuAD cross-lingual question-answering benchmark.
Efficient Training
Trained with mixed precision and gradient accumulation, allowing training to complete efficiently on a Tesla P100 GPU.
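Gradient accumulation lets a memory-limited GPU such as the P100 emulate a larger batch by summing gradients over several small forward/backward passes before each optimizer step. A minimal sketch of the bookkeeping, using illustrative values only (the card does not state the actual hyperparameters):

```python
def effective_batch_size(per_device_batch: int,
                         accumulation_steps: int,
                         num_devices: int = 1) -> int:
    """Batch size seen by the optimizer when gradients are accumulated
    over several micro-batches before each update."""
    return per_device_batch * accumulation_steps * num_devices

# Illustrative numbers, not the model's real training configuration.
micro_batch = 8   # micro-batch that fits in P100 memory
accum_steps = 4   # forward/backward passes per optimizer step
print(effective_batch_size(micro_batch, accum_steps))  # → 32
```

Mixed-precision training complements this by storing activations in float16, roughly halving memory use and freeing room for larger micro-batches.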
Model Capabilities
Multilingual Q&A
Reading Comprehension
Cross-lingual Transfer Learning
Text Understanding
Use Cases
Education
Multilingual Learning Assistance
Helps students understand reading materials in different languages and answer questions about them
Customer Service
Multilingual FAQ System
Powers an automated question-answering system that supports multiple languages