# SQuAD Adaptation
All entries below are Transformers-based question-answering models fine-tuned on SQuAD-style data. License is shown where the listing provides one.

| Model | Author | License | Downloads | Likes | Description |
|---|---|---|---|---|---|
| Electra Squad Training | mlxen | Apache-2.0 | 20 | 0 | ELECTRA-small model fine-tuned on the SQuAD dataset for question answering. |
| Bert Finetuned Squad Legalbert | Jasu | | 16 | 0 | Fine-tuned version of Legal-BERT on the SQuAD dataset, specializing in legal-domain question answering. |
| Bart Base Few Shot K 256 Finetuned Squad Seed 2 | anas-awadalla | Apache-2.0 | 13 | 0 | BART-base question-answering model fine-tuned on SQuAD for few-shot learning scenarios. |
| Bart Base Few Shot K 256 Finetuned Squad Seed 0 | anas-awadalla | Apache-2.0 | 13 | 0 | Fine-tuned version of facebook/bart-base on the SQuAD dataset for question answering. |
| Bart Base Few Shot K 128 Finetuned Squad Seed 2 | anas-awadalla | Apache-2.0 | 13 | 0 | BART-base model fine-tuned on the SQuAD dataset for QA tasks. |
| Roberta Base Squad2 Finetuned Squad | ms12345 | | 14 | 0 | Fine-tuned version of deepset/roberta-base-squad2 for question answering. |
| Bert Finetuned Squad | spasis | Apache-2.0 | 16 | 0 | BERT-based question-answering model fine-tuned on the SQuAD dataset. |
| Roberta Base Fiqa Flm Sq Flit | vanadhi | | 205 | 1 | RoBERTa-base model fine-tuned for financial-domain QA, designed for customized QA systems in banking, finance, and insurance. |
| Roberta Base Few Shot K 128 Finetuned Squad Seed 42 | anas-awadalla | MIT | 19 | 0 | RoBERTa-base QA model fine-tuned on SQuAD in a few-shot setting. |
| Roberta Base Few Shot K 512 Finetuned Squad Seed 6 | anas-awadalla | MIT | 21 | 0 | RoBERTa-base question-answering model fine-tuned on SQuAD for reading comprehension. |
| Roberta Base Few Shot K 256 Finetuned Squad Seed 6 | anas-awadalla | MIT | 20 | 0 | RoBERTa-base question-answering model fine-tuned on SQuAD for reading comprehension. |
| Roberta Base Few Shot K 1024 Finetuned Squad Seed 4 | anas-awadalla | MIT | 19 | 0 | RoBERTa-base QA model fine-tuned on SQuAD for reading comprehension. |
| Mobilebert Squadv2 | aware-ai | | 39 | 1 | English question-answering model based on the MobileBERT architecture, fine-tuned on SQuAD v2 and suited to resource-constrained devices. |
| Albert Xxlarge V2 Squad2 | mfeb | | 150 | 2 | QA model based on the ALBERT-xxlarge-v2 architecture, fine-tuned on SQuAD v2. |
| Bert Base Uncased Finetuned Squad | kaporter | Apache-2.0 | 93 | 0 | BERT-base-uncased question-answering model fine-tuned on the SQuAD dataset. |
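As a usage sketch, models of this kind can be loaded through the Hugging Face Transformers question-answering pipeline. The exact repository IDs of the fine-tunes listed above are not given here, so the snippet below uses `deepset/roberta-base-squad2`, the base checkpoint cited in the Roberta Base Squad2 Finetuned Squad entry, as a stand-in; any of the listed checkpoints could be substituted once its repo ID is known.

```python
# Minimal sketch: extractive QA with a SQuAD-style checkpoint via the
# Transformers pipeline. The model name is an assumption taken from the
# listing; swap in any of the fine-tuned checkpoints above as needed.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="deepset/roberta-base-squad2",
)

context = (
    "The Stanford Question Answering Dataset (SQuAD) is a reading-comprehension "
    "benchmark; SQuAD v2 adds questions that have no answer in the passage."
)

result = qa(
    question="What does SQuAD v2 add?",
    context=context,
    handle_impossible_answer=True,  # lets SQuAD v2-style models return "no answer"
)
print(result["answer"], result["score"])
```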