# English Reading Comprehension

## Distilbert Base Uncased Finetuned Squad
- License: Apache-2.0
- Author: LeWince
- Tags: Question Answering System, Transformers
- Downloads: 15 · Likes: 0

A question-answering model fine-tuned on the SQuAD dataset, based on distilbert-base-uncased.
## Distilbert Base Uncased Finetuned Squad
- License: Apache-2.0
- Author: sasuke
- Tags: Question Answering System, Transformers
- Downloads: 16 · Likes: 0

A fine-tuned version of distilbert-base-uncased trained on the SQuAD question-answering dataset, suitable for QA tasks.
## Mobilebert Uncased Squad V1
- License: MIT
- Author: csarron
- Tags: Question Answering System, Transformers, English
- Downloads: 160 · Likes: 0

MobileBERT is a lightweight variant of BERT_LARGE that uses a bottleneck structure to balance self-attention and feed-forward capacity. This version is fine-tuned on the SQuAD 1.1 dataset and is suitable for question-answering tasks.
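The bottleneck design mentioned above can be sketched in a few lines: each block projects the wide inter-block hidden state down to a narrow internal width, does its work there, and projects back up. This is a minimal illustration only; the dimensions, initialization, and the stand-in nonlinearity are assumptions for the sketch, not MobileBERT's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN = 512      # wide inter-block hidden size (assumed for illustration)
BOTTLENECK = 128  # narrow internal width (assumed for illustration)

# Down- and up-projection matrices around the narrow "body" of the block.
W_down = rng.normal(0, 0.02, (HIDDEN, BOTTLENECK))
W_up = rng.normal(0, 0.02, (BOTTLENECK, HIDDEN))

def bottleneck_block(x: np.ndarray) -> np.ndarray:
    """Map (seq_len, HIDDEN) -> (seq_len, HIDDEN) through a narrow bottleneck."""
    narrow = x @ W_down               # project down: expensive ops run at width 128
    narrow = np.maximum(narrow, 0.0)  # stand-in for the attention/feed-forward work
    return x + narrow @ W_up          # project back up, with a residual connection

tokens = rng.normal(size=(16, HIDDEN))
out = bottleneck_block(tokens)
print(out.shape)  # (16, 512)
```

The point of the bottleneck is that the per-block computation scales with the narrow width rather than the wide one, which is why the model suits resource-limited devices.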
## Albert Base V2 Squad2
- Author: twmkn9
- Tags: Question Answering System, Transformers
- Downloads: 4,152 · Likes: 4

A Q&A model fine-tuned on the SQuAD v2 dataset, based on the ALBERT base v2 architecture, performing well on reading-comprehension tasks that include unanswerable questions.
## Mobilebert Squadv2
- Author: aware-ai
- Tags: Question Answering System, Transformers, English
- Downloads: 39 · Likes: 1

A question-answering model fine-tuned on the SQuAD v2 dataset, based on the MobileBERT architecture and suitable for resource-limited devices.
## Bert Small Finetuned Squadv2
- Author: mrm8488
- Tags: Question Answering System, English
- Downloads: 314 · Likes: 1

BERT-Small is a compact BERT model developed by Google Research, fine-tuned here on the SQuAD 2.0 dataset and suitable for Q&A tasks in resource-constrained environments.
## Mobilebert Uncased Finetuned Squadv2
- Author: mrm8488
- Tags: Question Answering System, Transformers, English
- Downloads: 68 · Likes: 2

A QA model fine-tuned from the lightweight MobileBERT architecture, optimized for the SQuAD v2 dataset and capable of handling both answerable and unanswerable questions.
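Several models in this list are trained on SQuAD v2, where a question may have no answer in the passage. A minimal sketch of how such extractive models typically decide between a text span and "no answer" follows: the model emits per-token start and end logits, position 0 (the `[CLS]` token) stands for the null answer, and the best non-null span is compared against the null score. The logits and the threshold below are made-up illustration values, not outputs of any model listed here.

```python
import numpy as np

def decode_span(start_logits, end_logits, max_len=15, null_threshold=0.0):
    """Return (start, end) token indices for the best span, or None for 'unanswerable'."""
    start_logits = np.asarray(start_logits, dtype=float)
    end_logits = np.asarray(end_logits, dtype=float)

    # Score of predicting the null answer at the [CLS] position (index 0).
    null_score = start_logits[0] + end_logits[0]

    # Best non-null span with start <= end and a length cap.
    best_score, best_span = -np.inf, None
    for s in range(1, len(start_logits)):
        for e in range(s, min(s + max_len, len(end_logits))):
            score = start_logits[s] + end_logits[e]
            if score > best_score:
                best_score, best_span = score, (s, e)

    # Prefer the null answer when it beats the best span by the threshold.
    if null_score - best_score > null_threshold:
        return None
    return best_span

# Toy logits where tokens 3..4 form the best span.
print(decode_span([1.0, -2.0, -1.0, 5.0, 0.0],
                  [1.0, -2.0, -1.0, 0.0, 4.0]))  # (3, 4)

# Toy logits where the [CLS] null answer dominates -> unanswerable.
print(decode_span([8.0, -3.0, -3.0, -3.0, -3.0],
                  [8.0, -3.0, -3.0, -3.0, -3.0]))  # None
```

Models fine-tuned only on SQuAD 1.1 (such as the MobileBERT Squad V1 entry above) skip the null comparison and always return the best span.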