
DistilBERT MLM 750k

Developed by vocab-transformers
DistilBERT is a lightweight distilled version of BERT, retaining most of the performance but with fewer parameters.
Downloads 26
Release Time: 4/2/2022

Model Overview

DistilBERT is a lightweight model compressed from BERT through knowledge distillation, and is primarily used for natural language processing tasks such as text classification, question answering, and named entity recognition.
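Since this checkpoint is a masked-language-model pre-trained DistilBERT, the most direct way to try it is a fill-mask query. The sketch below assumes the model is published on the Hugging Face Hub under the ID "vocab-transformers/distilbert-mlm-750k"; the exact repository name may differ.

```python
# A minimal usage sketch, assuming the Hub ID below is correct.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="vocab-transformers/distilbert-mlm-750k",  # assumed repository ID
)

# DistilBERT uses the [MASK] token for masked-language-model prediction.
for prediction in fill_mask("The capital of France is [MASK]."):
    print(f"{prediction['token_str']!r}  score={prediction['score']:.3f}")
```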

Model Features

Lightweight and efficient
Knowledge distillation reduces the parameter count by about 40% compared to the original BERT while retaining roughly 97% of its performance.
Multi-task support
Supports various natural language processing tasks, including text classification, question answering, and named entity recognition (a fine-tuning sketch follows this feature list).
Rich pre-training data
Pre-trained on corpora including C4, MS MARCO, Wikipedia, S2ORC, and news data.
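Multi-task use works the standard DistilBERT way: the MLM head is dropped and a task-specific head is attached to the distilled encoder, then fine-tuned. Below is a sketch for a binary sentiment classifier; the Hub ID is an assumption and the classification head starts randomly initialised.

```python
# A fine-tuning sketch for sequence classification (assumed Hub ID, toy batch).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "vocab-transformers/distilbert-mlm-750k"  # assumed repository ID
tokenizer = AutoTokenizer.from_pretrained(model_id)

# The MLM head is discarded; a new classification head is added and must be trained.
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

batch = tokenizer(["great product", "terrible service"],
                  padding=True, return_tensors="pt")
labels = torch.tensor([1, 0])

outputs = model(**batch, labels=labels)
print(outputs.loss)        # training loss for this toy batch
outputs.loss.backward()    # gradients for one optimisation step
```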

Model Capabilities

Text classification
Question answering system
Named entity recognition
Text embedding
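For text embeddings, a common approach is to mean-pool the encoder's token representations while masking out padding. The sketch below again assumes the Hub ID "vocab-transformers/distilbert-mlm-750k".

```python
# A sketch of extracting sentence embeddings via mean pooling (assumed Hub ID).
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "vocab-transformers/distilbert-mlm-750k"  # assumed repository ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

sentences = ["DistilBERT is small and fast.", "BERT is larger but similar."]
batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    token_embeddings = model(**batch).last_hidden_state  # (batch, seq, hidden)

# Mean-pool over real tokens only, ignoring padding positions.
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (token_embeddings * mask).sum(1) / mask.sum(1)

similarity = torch.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(f"cosine similarity: {similarity.item():.3f}")
```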

Use Cases

Text analysis
Sentiment analysis
Classify the sentiment of text as positive, negative, or neutral.
Performance is close to the original BERT model on multiple benchmark datasets.
Spam detection
Identify and classify spam or harmful content.
Information extraction
Named entity recognition
Extract entities such as person names, place names, and organization names from text (see the token-classification sketch after this list).
Question answering system
Answer user questions based on a given passage of text.
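For the named entity recognition use case, the encoder is loaded with a token-classification head and fine-tuned on labelled NER data (for example CoNLL-2003). The sketch below uses an assumed Hub ID and an illustrative label set; predictions are meaningless until the head has been fine-tuned.

```python
# An NER sketch: token-classification head on the distilled encoder (assumed Hub ID).
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

model_id = "vocab-transformers/distilbert-mlm-750k"  # assumed repository ID
labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC", "B-ORG", "I-ORG"]  # illustrative

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(
    model_id,
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)

# After fine-tuning, the pipeline groups word pieces into whole entities.
ner = pipeline("token-classification", model=model, tokenizer=tokenizer,
               aggregation_strategy="simple")
print(ner("Barack Obama visited Paris with the United Nations delegation."))
```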