
DistilBERT Base Uncased Distilled SQuAD

Developed by distilbert
DistilBERT is a lightweight, distilled version of BERT with 40% fewer parameters that runs about 60% faster while retaining over 95% of BERT's performance on the GLUE benchmark. This model is fine-tuned specifically for question answering.
Downloads 154.39k
Release Time: 3/2/2022

Model Overview

A fine-tuned version of DistilBERT-base-uncased, trained with knowledge distillation on the SQuAD v1.1 dataset and suited to English extractive question answering.
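
As a minimal sketch of typical usage (assuming the checkpoint is available on the Hugging Face Hub under the id distilbert-base-uncased-distilled-squad; the question and context strings are illustrative, not from the original card):

    from transformers import pipeline

    # Load the fine-tuned checkpoint into a ready-made question-answering pipeline.
    qa = pipeline("question-answering", model="distilbert-base-uncased-distilled-squad")

    result = qa(
        question="How much faster is DistilBERT than BERT?",
        context="DistilBERT has 40% fewer parameters than BERT and runs 60% faster.",
    )
    # The pipeline returns the extracted answer span plus a confidence score.
    print(result["answer"], result["score"])

The pipeline wraps tokenization, span prediction, and decoding in one call, which is usually the simplest way to run extractive QA with this model.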

Model Features

Efficient and Lightweight
Compared to the original BERT model, it has 40% fewer parameters and runs inference about 60% faster.
High Performance
Maintains over 95% of BERT's performance on the GLUE benchmark.
Q&A Optimized
Fine-tuned specifically for the SQuAD question answering task, achieving an 86.9 F1 score on SQuAD v1.1.

Model Capabilities

Extractive Question Answering
Text Understanding
Answer Localization
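
Answer localization here means the model predicts the start and end token positions of the answer span inside the context rather than generating free text. A lower-level sketch of that mechanism (same assumed model id as above; question and context are made-up examples):

    import torch
    from transformers import AutoModelForQuestionAnswering, AutoTokenizer

    model_id = "distilbert-base-uncased-distilled-squad"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForQuestionAnswering.from_pretrained(model_id)

    question = "What dataset was the model fine-tuned on?"
    context = "The model was fine-tuned on the SQuAD v1.1 dataset for extractive question answering."

    # Encode the question/context pair as a single input sequence.
    inputs = tokenizer(question, context, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # The model emits one logit per token for the span start and one for the span end;
    # taking the argmax of each localizes the answer inside the context.
    start = torch.argmax(outputs.start_logits)
    end = torch.argmax(outputs.end_logits) + 1
    print(tokenizer.decode(inputs["input_ids"][0][start:end]))  # prints the predicted answer span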

Use Cases

Q&A System
Document-based Q&A: extracts answers to questions from a given text, achieving an 86.9 F1 score on the SQuAD v1.1 dataset.
Knowledge Retrieval
Searches for relevant information in a knowledge base.