DistilBERT Base Uncased Finetuned MRPC
This is a text classification model based on DistilBERT and fine-tuned on the GLUE MRPC (Microsoft Research Paraphrase Corpus) task. It determines whether two sentences in a pair are semantically equivalent.
Downloads: 18
Release Time: 3/2/2022
Model Overview
A lightweight DistilBERT-based text classification model optimized for judging the semantic equivalence of sentence pairs, performing well on the MRPC dataset.
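Below is a minimal usage sketch for classifying a sentence pair with the transformers library. The checkpoint ID is assumed from the model name on this page; substitute the actual Hugging Face Hub ID if it differs.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "distilbert-base-uncased-finetuned-mrpc"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id).eval()

sentence_a = "The company said its profits rose in the last quarter."
sentence_b = "Profits at the company increased during the most recent quarter."

# Encode the two sentences together so the model sees them as a pair.
inputs = tokenizer(sentence_a, sentence_b, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

probs = torch.softmax(logits, dim=-1)[0]
# For MRPC, label 1 conventionally means "equivalent" and label 0 "not equivalent".
print(f"not equivalent: {probs[0]:.3f}, equivalent: {probs[1]:.3f}")
```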
Model Features
Lightweight and Efficient
Based on the DistilBERT architecture, the model is about 40% smaller than standard BERT while retaining roughly 97% of its language-understanding performance
High Accuracy
Achieves 84.56% accuracy and an 89.59% F1 score on the MRPC evaluation set (see the evaluation sketch after this list)
Fast Inference
The distilled architecture makes inference roughly 60% faster than the original BERT
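A rough sketch for reproducing the reported MRPC metrics with the datasets and evaluate libraries is shown below; the checkpoint ID is assumed as above, and the batch size is an arbitrary choice.

```python
import torch
from datasets import load_dataset
from evaluate import load as load_metric
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "distilbert-base-uncased-finetuned-mrpc"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id).eval()

dataset = load_dataset("glue", "mrpc", split="validation")
metric = load_metric("glue", "mrpc")  # reports accuracy and F1

for start in range(0, len(dataset), 32):
    batch = dataset[start:start + 32]  # dict of column lists
    inputs = tokenizer(batch["sentence1"], batch["sentence2"],
                       padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        preds = model(**inputs).logits.argmax(dim=-1)
    metric.add_batch(predictions=preds, references=batch["label"])

print(metric.compute())  # e.g. {'accuracy': ..., 'f1': ...}
```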
Model Capabilities
Text Classification
Semantic Similarity Judgment
Sentence Pair Relationship Analysis
Use Cases
Text Processing
Paraphrase Detection
Determine whether two sentences express the same meaning
Accuracy: 84.56%
Question Answering Systems
Identify semantically similar questions so they can be answered with a single, unified response
Content Moderation
Duplicate Content Detection
Identify content that expresses the same meaning in different wording (a rough sketch follows below)
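The sketch below illustrates duplicate-content detection by scoring every pair of texts in a small collection and flagging likely paraphrases. The helper function and the 0.5 threshold are illustrative choices, not part of the model card, and the checkpoint ID is assumed as above.

```python
from itertools import combinations

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "distilbert-base-uncased-finetuned-mrpc"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id).eval()

texts = [
    "Our store is open every day from 9am to 6pm.",
    "We are open daily between 9 in the morning and 6 in the evening.",
    "Shipping is free for orders over 50 dollars.",
]

def equivalence_probability(a: str, b: str) -> float:
    """Return the model's probability that a and b are paraphrases."""
    inputs = tokenizer(a, b, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return torch.softmax(logits, dim=-1)[0, 1].item()

for a, b in combinations(texts, 2):
    score = equivalence_probability(a, b)
    if score > 0.5:  # illustrative threshold
        print(f"Possible duplicate ({score:.2f}):\n  {a}\n  {b}")
```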