
DistilBERT MRPC

Developed by mattchurgin
A text classification model based on distilbert-base-uncased and fine-tuned on the GLUE MRPC dataset, used to determine whether two sentences in a pair are semantically equivalent
Downloads 15
Release Time: 3/2/2022

Model Overview

This model is a fine-tuned version of DistilBERT, specifically designed for the MRPC (Microsoft Research Paraphrase Corpus) task, which determines whether two sentences express the same meaning.
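Below is a minimal inference sketch using the Hugging Face transformers library. The Hub id ("mattchurgin/distilbert-mrpc") and the assumption that label index 1 corresponds to "equivalent" are illustrative and should be checked against the actual checkpoint.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Hypothetical Hub id for this model; replace with the actual repository name.
MODEL_ID = "mattchurgin/distilbert-mrpc"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

# MRPC is a sentence-pair task: both sentences are encoded in a single input.
sentence_a = "The company reported strong quarterly earnings."
sentence_b = "Quarterly profits at the firm were robust."
inputs = tokenizer(sentence_a, sentence_b, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# For GLUE MRPC, label 1 conventionally means "equivalent" (paraphrase);
# verify against the checkpoint's id2label mapping.
probs = torch.softmax(logits, dim=-1)[0]
print(f"not equivalent: {probs[0]:.3f}, equivalent: {probs[1]:.3f}")
```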

Model Features

Efficient and Lightweight
Based on the DistilBERT architecture, it is 40% smaller than the full BERT model but retains 97% of its performance.
High Accuracy
Achieves 84.8% accuracy and an F1 score of 89.35% on the MRPC task.
Fast Inference
The distilled model design enables faster inference compared to the original BERT.

Model Capabilities

Text classification
Semantic similarity judgment
Sentence pair analysis

Use Cases

Text processing
QA system deduplication: identify similar questions to avoid duplicate answers, improving QA system efficiency (a sketch follows this list).
Content moderation: detect duplicate or highly similar inappropriate content, improving moderation efficiency.
Educational technology
Student answer grading: judge the semantic similarity between a student's answer and a reference answer to assist automatic grading systems.
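
For the QA system deduplication scenario, the sketch below flags an incoming question that paraphrases one already answered. The Hub id, the probability threshold, and the assumption that label index 1 means "equivalent" are all illustrative, not part of the original model card.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = "mattchurgin/distilbert-mrpc"  # hypothetical Hub id
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

def is_duplicate(new_question: str, existing: list[str], threshold: float = 0.9) -> bool:
    """Return True if new_question paraphrases any previously seen question."""
    for seen in existing:
        inputs = tokenizer(new_question, seen, return_tensors="pt", truncation=True)
        with torch.no_grad():
            probs = torch.softmax(model(**inputs).logits, dim=-1)[0]
        if probs[1].item() >= threshold:  # index 1 assumed to mean "equivalent"
            return True
    return False

existing_questions = [
    "How do I reset my password?",
    "Where can I download my invoice?",
]
print(is_duplicate("What is the procedure for resetting a password?", existing_questions))
```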