Distilbert Base Uncased Finetuned Rte
This is a text classification model based on DistilBERT and fine-tuned on the RTE task of the GLUE benchmark, achieving 61.73% accuracy.
Downloads: 20
Release Time: 3/2/2022
Model Overview
A lightweight text classification model specifically designed for recognizing textual entailment relationships (RTE task)
Model Features
Lightweight and Efficient
Built on the DistilBERT architecture, which is 40% smaller than BERT-base while retaining about 97% of its language-understanding performance
Task-Specific Optimization
Specifically fine-tuned for the RTE (Recognizing Textual Entailment) task in the GLUE benchmark
Model Capabilities
Text Classification
Recognizing Textual Entailment (RTE)
Natural Language Inference
Use Cases
Natural Language Processing
Textual Entailment Judgment
Determine whether a premise text entails a hypothesis text
Achieves 61.73% accuracy on the GLUE RTE task
Question Answering System Support
Assist question-answering systems in determining whether an answer is relevant to a question
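The use cases above boil down to feeding a premise/hypothesis pair to the fine-tuned classification head. A minimal sketch with the Hugging Face `transformers` pipeline is shown below; the Hub checkpoint id is an assumption (substitute the actual repository name for this model), and the label names follow the usual RTE convention of entailment vs. not_entailment.

```python
from transformers import pipeline

# Assumed Hub id -- replace with the actual checkpoint for this model card.
MODEL_ID = "anirudh21/distilbert-base-uncased-finetuned-rte"

classifier = pipeline("text-classification", model=MODEL_ID)

premise = "A man is playing a guitar on stage."
hypothesis = "A man is performing music."

# Sentence-pair input: the pipeline encodes premise and hypothesis together,
# and the RTE head predicts whether the premise entails the hypothesis.
result = classifier({"text": premise, "text_pair": hypothesis})
print(result)  # e.g. {'label': ..., 'score': ...}
```

For a question-answering setting, the question plus a candidate answer can be framed the same way, with the passage as the premise and the answer statement as the hypothesis.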