
DistilBERT ONNX

Developed by philschmid
This is a question-answering model based on DistilBERT-base-cased, fine-tuned on the SQuAD v1.1 dataset using knowledge distillation.
Downloads 8,650
Release date: 3/2/2022

Model Overview

This model is designed for question-answering tasks: given a context passage and a question, it extracts the answer span from the passage. It is a lightweight version of BERT-base-cased that retains most of the original model's performance through distillation.
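Extractive QA models like this one predict, for every token in the context, a start score and an end score; the answer is the span whose combined scores are highest. A minimal sketch of that span-selection step (with made-up logits standing in for real model outputs) might look like:

```python
import numpy as np

def best_span(start_logits, end_logits, max_len=15):
    """Brute-force search for the (start, end) token pair with the
    highest combined score, subject to a maximum span length."""
    best, best_score = (0, 0), -np.inf
    n = len(start_logits)
    for i in range(n):
        for j in range(i, min(i + max_len, n)):
            score = start_logits[i] + end_logits[j]
            if score > best_score:
                best_score, best = score, (i, j)
    return best

# Toy example: logits are invented for illustration, not real model output.
tokens = ["The", "Eiffel", "Tower", "is", "in", "Paris"]
start = np.array([0.1, 0.2, 0.1, 0.0, 0.3, 2.5])
end   = np.array([0.0, 0.1, 0.4, 0.1, 0.2, 2.8])
s, e = best_span(start, end)
print(" ".join(tokens[s : e + 1]))  # → Paris
```

Production pipelines additionally work in log-probability space and handle impossible spans (e.g. answers starting inside the question), but the core idea is this score maximization.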

Model Features

Knowledge Distillation
Distilled from the BERT-base-cased teacher model, retaining most of its performance at a significantly reduced model size.
Lightweight
Has fewer parameters than the original BERT model, yielding faster inference.
High Performance
Achieves an F1 score of 87.1 on the SQuAD v1.1 development set, close to BERT-base-cased's 88.7.
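The F1 score quoted above is SQuAD's token-overlap metric: the harmonic mean of precision and recall between the predicted and reference answer tokens. A simplified sketch (the official SQuAD script also normalizes articles and punctuation, which is omitted here):

```python
from collections import Counter

def squad_f1(prediction: str, ground_truth: str) -> float:
    """Token-level F1 between a predicted answer and a reference answer."""
    pred_tokens = prediction.lower().split()
    gold_tokens = ground_truth.lower().split()
    # Count tokens shared between prediction and reference.
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)

print(squad_f1("in Paris France", "Paris"))  # → 0.5
```

The dev-set score of 87.1 is this metric averaged over all questions, with each prediction compared against the best-matching human reference answer.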

Model Capabilities

Reading Comprehension
Question Answering System
Text Understanding

Use Cases

Education
Automated Answering System: helps students answer questions automatically based on textbook content, providing accurate answers grounded in the text.
Customer Service
FAQ Auto-Response: automatically answers common customer questions from knowledge-base content, improving service efficiency and reducing manual intervention.