# Q&A System

The models below are question-answering checkpoints published on the Hugging Face Hub. Each entry lists the model name, author, license, download and like counts, tags, and a one-line description. Most are extractive readers that can be loaded through the Transformers question-answering pipeline; a minimal usage sketch follows the table.

| Model | Author | License | Downloads | Likes | Tags | Description |
| --- | --- | --- | --- | --- | --- | --- |
| Fid T5 Large Nq | Intel | — | 156 | 3 | Question Answering System, Transformers, English | Fusion-in-Decoder (FiD) model built on the t5-large checkpoint and trained on the Natural Questions dataset for question answering. |
| Minilm NaturalQuestions | remunds | MIT | 31 | 1 | Large Language Model, Transformers | Model fine-tuned on the NaturalQuestions dataset from microsoft/MiniLM-L12-H384-uncased, suited to question-answering tasks. |
| Electra Qa | vaibhav9 | — | 14 | 0 | Question Answering System, Transformers | Q&A model based on the ELECTRA architecture, suited to question-answering tasks. |
| Distilbert Base Uncased Finetuned Squad | shizil | Apache-2.0 | 15 | 0 | Question Answering System, Transformers | Lightweight Q&A model based on DistilBERT, fine-tuned on the SQuAD dataset. |
| Distilbert Base Uncased Finetuned Squad | BillZou | Apache-2.0 | 14 | 0 | Question Answering System, Transformers | Question-answering model fine-tuned on the SQuAD dataset from DistilBERT, intended for reading comprehension. |
| Bert Base Uncased Finetuned Triviaqa | FabianWillner | Apache-2.0 | 191 | 0 | Question Answering System, Transformers | Q&A model fine-tuned on the TriviaQA dataset from bert-base-uncased. |
| Distilbert Base Uncased Finetuned Squad | shaojie | Apache-2.0 | 16 | 0 | Question Answering System, Transformers | Q&A model fine-tuned on the SQuAD dataset from the DistilBERT base model. |
| Distilbert Base Uncased Finetuned Squad | anu24 | Apache-2.0 | 16 | 0 | Question Answering System, Transformers | Fine-tuned version of DistilBERT on the SQuAD Q&A dataset, suited to question-answering tasks. |
| Distilbert Base Uncased Finetuned Squad | lorenzkuhn | Apache-2.0 | 15 | 0 | Question Answering System, Transformers | Question-answering model fine-tuned on the SQuAD v2 dataset from the DistilBERT base model, suited to reading comprehension. |
| Bert Base Uncased Squad V2.0 Finetuned | kamalkraj | Apache-2.0 | 84 | 0 | Question Answering System, Transformers | Fine-tuned version of bert-base-uncased on the squad_v2 dataset, suited to question-answering tasks. |
| Distilbert Base Uncased Finetuned Squad | ak987 | Apache-2.0 | 15 | 0 | Question Answering System, Transformers | Q&A model based on DistilBERT, fine-tuned on the SQuAD dataset for reading comprehension. |
| Distilbert Base Uncased Finetuned Squad | sam999 | Apache-2.0 | 15 | 0 | Question Answering System, Transformers | Question-answering model fine-tuned on the SQuAD dataset from DistilBERT, intended for extractive question answering. |
| Distilbert Base Uncased Finetuned Squad | jhoonk | Apache-2.0 | 15 | 0 | Question Answering System, Transformers | Model fine-tuned on Q&A datasets from the distilled BERT base model, suited to Q&A tasks. |
| Albert Xlarge Finetuned | 123tarunanand | — | 16 | 0 | Question Answering System, Transformers | ALBERT xlarge v2 model fine-tuned on the SQuAD v2 dataset for Q&A tasks. |
| Distilbert Base Uncased Finetuned Squad | kiana | Apache-2.0 | 15 | 0 | Question Answering System, Transformers | Fine-tuned version of DistilBERT on the SQuAD v2 question-answering dataset, intended for Q&A tasks. |
| Distilbert Base Uncased Finetuned Squad | jsunster | Apache-2.0 | 16 | 0 | Question Answering System, Transformers | Fine-tuned version of the DistilBERT base model on the SQuAD Q&A dataset, suited to question-answering tasks. |
| Bert Base Cased Squad2 | ydshieh | — | 39 | 0 | Question Answering System, Transformers | BERT base (cased) model trained on the SQuAD v2 dataset, suited to question-answering tasks. |
| Distilbert Base Uncased Finetuned Squad | 21iridescent | Apache-2.0 | 19 | 0 | Question Answering System, Transformers | Question-answering model fine-tuned on the SQuAD v2 dataset from DistilBERT, suited to reading comprehension. |
| Mobilebert Uncased Squad V2 | vumichien | MIT | 32 | 0 | Question Answering System, Transformers | Q&A model based on the MobileBERT architecture, fine-tuned on the SQuAD v2 dataset. |
| Roberta Base Squad2 Finetuned Squad | deepakvk | — | 14 | 0 | Question Answering System, Transformers | Question-answering model fine-tuned on the SQuAD 2.0 dataset from RoBERTa-base, strong at reading comprehension. |
| Roberta Base Finetuned Squad2 | mvonwyl | MIT | 19 | 0 | Question Answering System, Transformers | Q&A model fine-tuned on the SQuAD 2.0 dataset from the RoBERTa-base model. |
| Bert Medium Finetuned Squadv2 | mrm8488 | — | 1,399 | 1 | Question Answering System, English | Q&A model fine-tuned on the SQuAD 2.0 dataset from the BERT-Medium architecture, aimed at resource-constrained environments. |
| Distilbert Base Uncased Finetuned Squad | nikcook | Apache-2.0 | 20 | 0 | Question Answering System, Transformers | Lightweight DistilBERT-based model fine-tuned on the SQuAD question-answering dataset for Q&A tasks. |
| T5 Qa Squad2neg En | ThomasNLG | MIT | 533 | 0 | Question Answering System, Multilingual | Q&A model based on the T5-small architecture, supporting extractive Q&A and handling both answerable and unanswerable questions. |
| Distilbert Base Uncased Finetuned Squad | hark99 | Apache-2.0 | 20 | 0 | Question Answering System, Transformers | Question-answering model based on DistilBERT, fine-tuned on the SQuAD dataset for reading comprehension. |
| Bert Base Uncased Squad1.1 Block Sparse 0.20 V1 | madlag | MIT | 15 | 0 | Question Answering System, Transformers, English | Pruned and optimized BERT Q&A model retaining 38.1% of the original weights, fine-tuned on the SQuAD 1.1 dataset for English Q&A. |
| Distilbert Base Uncased Finetuned Squad | avioo1 | Apache-2.0 | 19 | 0 | Question Answering System, Transformers | Fine-tuned version of distilbert-base-uncased on the SQuAD question-answering dataset, intended for Q&A tasks. |
| Albert Base V2 Squad2 | twmkn9 | — | 4,152 | 4 | Question Answering System, Transformers | Q&A model fine-tuned on the SQuAD v2 dataset from the ALBERT base v2 architecture, strong at reading comprehension with unanswerable questions. |
| Bert Small Finetuned Squadv2 | mrm8488 | — | 314 | 1 | Question Answering System, English | BERT-Small, a compact BERT model from Google Research, fine-tuned on the SQuAD 2.0 dataset for Q&A in resource-constrained environments. |
| Electra Small Finetuned Squadv2 | mrm8488 | Apache-2.0 | 51 | 1 | Question Answering System, Transformers, English | Q&A model fine-tuned on the SQuAD v2.0 dataset from electra-small-discriminator, able to distinguish answerable from unanswerable questions. |
| Bert Finetuned Squad | huggingface-course | — | 399 | 8 | Question Answering System, Transformers | Q&A model fine-tuned on the SQuAD dataset from the BERT architecture. |
| Albert Base V2 Finetuned Squad | knlu1016 | Apache-2.0 | 16 | 0 | Question Answering System, Transformers | Fine-tuned version of albert-base-v2 on Q&A datasets, suited to question-answering tasks. |
| Cs224n Squad2.0 Albert Base V2 | elgeish | — | 169 | 0 | Question Answering System, Transformers | ALBERT-base-v2 model provided for Stanford CS224n students, used for benchmarking on the SQuAD 2.0 Q&A task. |
| Bert Base Uncased Finetuned QnA V1 | mujerry | Apache-2.0 | 23 | 0 | Question Answering System, Transformers | Fine-tuned version of bert-base-uncased for Q&A tasks, suited to English question answering. |
| Bert Base Uncased Finetuned Squad | victoraavila | Apache-2.0 | 18 | 0 | Question Answering System, Transformers | Q&A model fine-tuned on the SQuAD 1.1 dataset from the bert-base-uncased model. |
| Bert Base Uncased Finetuned Squad | kaporter | Apache-2.0 | 93 | 0 | Question Answering System, Transformers | Q&A model fine-tuned on the SQuAD dataset from the BERT base model. |
| Microsoft Deberta Large Squad | Palak | — | 112 | 0 | Question Answering System, Transformers | Fine-tuned version of microsoft_deberta-large on the SQuAD v1 dataset, used primarily for question answering. |
| Distilbert Base Uncased Distilled Squad | distilbert | Apache-2.0 | 154.39k | 115 | Question Answering System, Transformers, English | DistilBERT is a lightweight distilled version of BERT with 40% fewer parameters and 60% faster inference while retaining over 95% of BERT's GLUE performance; this checkpoint is fine-tuned for question answering. |
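
As a rough illustration of how the extractive checkpoints above are typically used, the minimal sketch below loads distilbert/distilbert-base-uncased-distilled-squad (the last entry in the table) through the Hugging Face Transformers question-answering pipeline. The passage and questions are invented for the example, and any other SQuAD-style reader in the table could be substituted by its Hub id.

```python
# Minimal usage sketch: extractive QA with one of the checkpoints listed above.
# Assumes the `transformers` package and a PyTorch backend are installed.
from transformers import pipeline

# distilbert/distilbert-base-uncased-distilled-squad is the last entry in
# the table; other SQuAD-style readers can be swapped in by their Hub id.
qa = pipeline(
    "question-answering",
    model="distilbert/distilbert-base-uncased-distilled-squad",
)

# Passage and question invented for illustration.
context = (
    "DistilBERT is a distilled version of BERT with 40% fewer parameters "
    "that runs about 60% faster while retaining most of BERT's accuracy."
)
result = qa(question="How much faster is DistilBERT than BERT?", context=context)

# The pipeline returns the predicted answer span, a confidence score,
# and character offsets into the context.
print(result["answer"], result["score"], result["start"], result["end"])

# With SQuAD v2-style checkpoints (the squadv2 / squad2 entries above),
# handle_impossible_answer=True lets the pipeline return an empty answer
# when the question cannot be answered from the context.
result_v2 = qa(
    question="Who invented the telephone?",
    context=context,
    handle_impossible_answer=True,
)
print(result_v2["answer"] or "<no answer found>")
```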