Bert Tiny Finetuned Glue Rte
Developed by muhtasham
This is a text classification model based on the BERT-tiny architecture, fine-tuned on the GLUE RTE task for recognizing textual entailment between sentence pairs.
Downloads: 37
Release Time: 8/1/2022
Model Overview
This model is a fine-tuned version of google/bert_uncased_L-2_H-128_A-2 on the GLUE RTE dataset, designed for the Recognizing Textual Entailment (RTE) text classification task.
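A minimal inference sketch using the transformers pipeline is shown below. The repository ID "muhtasham/bert-tiny-finetuned-glue-rte" is an assumption based on the model name above; adjust it to the actual Hub ID if it differs.

```python
# Quick-start sketch, assuming the model is published on the Hugging Face Hub
# under "muhtasham/bert-tiny-finetuned-glue-rte" (assumed repository ID).
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="muhtasham/bert-tiny-finetuned-glue-rte",
)

# RTE is a sentence-pair task: premise first, hypothesis second.
premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

# Passing a dict with "text" and "text_pair" makes the pipeline encode
# the two sentences as a single pair.
result = classifier({"text": premise, "text_pair": hypothesis})

# Prints the predicted label and score; the label names (e.g. entailment /
# not_entailment or LABEL_0 / LABEL_1) depend on the model's config.
print(result)
```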
Model Features
Lightweight BERT Model
Built on the BERT-tiny architecture, the model is small enough to run in resource-constrained environments.
Optimized for GLUE RTE Task
Fine-tuned specifically for the Recognizing Textual Entailment (RTE) task of the GLUE benchmark.
Efficient Training
Trained with the Adam optimizer and a linear learning-rate schedule; a minimal fine-tuning sketch follows below.
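The following sketch reproduces the described setup (google/bert_uncased_L-2_H-128_A-2 fine-tuned on GLUE RTE with the AdamW optimizer and a linear learning-rate schedule, which are the Trainer defaults). The hyperparameter values are illustrative assumptions, not the exact ones used by the author.

```python
# Hedged fine-tuning sketch; hyperparameters are assumed, not reported values.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

checkpoint = "google/bert_uncased_L-2_H-128_A-2"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# GLUE RTE provides sentence pairs labelled entailment / not_entailment.
raw = load_dataset("glue", "rte")

def encode(batch):
    return tokenizer(
        batch["sentence1"], batch["sentence2"], truncation=True, max_length=128
    )

encoded = raw.map(encode, batched=True)

args = TrainingArguments(
    output_dir="bert-tiny-finetuned-glue-rte",
    learning_rate=2e-5,              # assumed value
    per_device_train_batch_size=32,  # assumed value
    num_train_epochs=3,              # assumed value
    lr_scheduler_type="linear",      # linear schedule, as stated above
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,
)
trainer.train()
```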
Model Capabilities
Text Classification
Textual Entailment Recognition
Natural Language Understanding
Use Cases
Natural Language Processing
Textual Entailment Judgment
Determine whether one sentence entails the meaning of another sentence.
Achieved 63.18% accuracy on the GLUE RTE task.
Text Relationship Analysis
Analyze the logical relationship between two pieces of text; the sketch below shows how to read the per-class probabilities directly.
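When you want the scores for both classes rather than just the top label, you can call the model directly instead of using the pipeline. As above, the repository ID is an assumption.

```python
# Sketch of inspecting class probabilities directly, assuming the same
# "muhtasham/bert-tiny-finetuned-glue-rte" repository ID as above.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "muhtasham/bert-tiny-finetuned-glue-rte"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Encode the premise and hypothesis as one sentence pair.
inputs = tokenizer(
    "The cat sat on the mat.", "An animal is on the mat.", return_tensors="pt"
)
with torch.no_grad():
    logits = model(**inputs).logits

# Softmax over the two classes, then map indices to label names from the config.
probs = torch.softmax(logits, dim=-1)[0]
for idx, p in enumerate(probs):
    print(model.config.id2label[idx], float(p))
```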