
DistilBERT Base Uncased

Developed by distilbert
DistilBERT is a distilled version of the BERT base model: it retains similar performance while being smaller and faster, making it suitable for natural language processing tasks such as sequence classification and token classification.
Downloads: 11.1M
Release Date: 3/2/2022

Model Overview

A lightweight language model based on the Transformer architecture, compressed from the BERT base model through knowledge distillation, supporting English text-understanding tasks.
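As a concrete illustration of loading the model for text feature extraction, here is a minimal sketch using the Hugging Face transformers library; the example sentence and the printed shape are for illustration only:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Load the pretrained DistilBERT encoder and its tokenizer.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModel.from_pretrained("distilbert-base-uncased")

# Tokenize an example sentence and run a forward pass.
inputs = tokenizer("DistilBERT is compact and fast.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state holds one 768-dimensional contextual embedding
# per input token: shape (batch, seq_len, 768).
print(outputs.last_hidden_state.shape)
```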

Model Features

Efficient Distillation
Retains 97% of BERT-base's performance through knowledge distillation while reducing model size by 40%
Fast Inference
Runs inference about 60% faster than the original BERT model
Multi-Task Adaptability
Supports fine-tuning for downstream tasks, making it suitable for a wide range of natural language processing scenarios (as sketched below)
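To illustrate the multi-task adaptability, the following is a minimal sketch of attaching a task-specific head for fine-tuning; num_labels=2 is an assumption for a binary classification task, not part of the checkpoint itself:

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
)

# Load the pretrained backbone with a freshly initialized
# classification head; num_labels=2 assumes a binary task.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)
# The model can now be fine-tuned on labeled data, e.g. with the
# transformers Trainer API or a standard PyTorch training loop.
```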

Model Capabilities

Text Feature Extraction
Masked Word Prediction (see the sketch after this list)
Sentence Semantic Understanding
Text Classification
Question Answering
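For masked word prediction in particular, the fill-mask pipeline offers a one-line interface; the example sentence below is purely illustrative:

```python
from transformers import pipeline

# The fill-mask pipeline predicts candidate tokens for [MASK].
unmasker = pipeline("fill-mask", model="distilbert-base-uncased")

for prediction in unmasker("Hello, I'm a [MASK] model."):
    print(f"{prediction['token_str']}: {prediction['score']:.3f}")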

Use Cases

Text Analysis
Sentiment Analysis
Classify review content as positive or negative sentiment
A fine-tuned checkpoint achieves 91.3% accuracy on the SST-2 dataset
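A minimal sketch of the sentiment use case, assuming the widely used SST-2 fine-tuned checkpoint distilbert-base-uncased-finetuned-sst-2-english (the base model itself must be fine-tuned before it can classify sentiment):

```python
from transformers import pipeline

# Assumes the SST-2 fine-tuned DistilBERT checkpoint.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("This movie was absolutely wonderful!"))
# Expected output resembles: [{'label': 'POSITIVE', 'score': ...}]
```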
Information Extraction
Named Entity Recognition
Identify entities such as person names, locations, and organizations in text (fine-tuning setup sketched below)
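The base checkpoint is not trained for NER out of the box; the following is a minimal sketch of preparing it for token-classification fine-tuning. The label count is an assumption taken from the 9-label CoNLL-2003 BIO scheme:

```python
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
)

# num_labels=9 assumes the CoNLL-2003 BIO tagging scheme
# (O plus B-/I- tags for PER, LOC, ORG, MISC).
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForTokenClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=9
)
# Fine-tune on an entity-annotated dataset before inference.
```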