
BERT-Tiny Fine-Tuned on MRPC

Developed by M-FAC
This model is based on the BERT-tiny architecture and fine-tuned for text classification on the MRPC dataset using the M-FAC second-order optimizer.
Downloads: 46
Release date: March 2, 2022

Model Overview

The model is primarily designed for sentence pair classification tasks, specifically optimized for performance on the MRPC (Microsoft Research Paraphrase Corpus) dataset.
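A minimal usage sketch for sentence-pair classification with this checkpoint. It assumes the model is published on the Hugging Face Hub under the identifier `M-FAC/bert-tiny-finetuned-mrpc` (inferred from the developer and model name above; verify the exact id before use) and that it follows the standard MRPC label convention (1 = paraphrase, 0 = not a paraphrase).

```python
# Hedged sketch: the model id below is assumed, not confirmed by this card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "M-FAC/bert-tiny-finetuned-mrpc"  # assumed Hub identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# MRPC is a sentence-pair task: pass both sentences to the tokenizer together.
s1 = "The company posted record profits this quarter."
s2 = "Record quarterly profits were reported by the firm."
inputs = tokenizer(s1, s2, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()  # assumed convention: 1 = paraphrase
print("paraphrase" if pred == 1 else "not paraphrase")
```

Because BERT-tiny has only two transformer layers, this runs comfortably on CPU, which is the point of the lightweight architecture described below.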

Model Features

M-FAC second-order optimization
Fine-tuned with the M-FAC second-order optimizer, which is reported to outperform the standard Adam optimizer on this task.
Lightweight architecture
Based on the BERT-tiny architecture with fewer parameters, making it suitable for resource-constrained environments.
Robust performance
Exhibits stable results across multiple fine-tuning runs, with small standard deviations.

Model Capabilities

Text classification
Sentence similarity judgment
Semantic equivalence detection

Use Cases

Natural Language Processing
Text paraphrase detection
Determines whether two sentences are paraphrases of each other
Achieves an F1 score of 83.12 on the MRPC dataset
Semantic similarity analysis
Evaluates the semantic similarity between two sentences
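The F1 score cited above is the standard evaluation metric for MRPC, computed over the "paraphrase" class. As a toy illustration of how that number is derived (the labels and predictions below are made up, not outputs of this model):

```python
# Toy F1 computation for a binary paraphrase task (1 = paraphrase).
def f1_score(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Made-up example: 4 true paraphrase pairs, one missed by the classifier.
y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 1, 0, 0, 1]
print(round(f1_score(y_true, y_pred), 3))  # → 0.857
```

On MRPC the reported 83.12 would correspond to this quantity (×100) over the full evaluation split.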