
BERT-mini Fine-tuned on MNLI

Developed by M-FAC
This model is based on the BERT-mini architecture, fine-tuned on the MNLI dataset using the M-FAC second-order optimizer for text classification tasks.
Downloads 290.56k
Release Time: 3/2/2022

Model Overview

The model is primarily designed for natural language inference tasks, with performance improvements on the MNLI dataset achieved through the M-FAC optimizer.

Model Features

M-FAC second-order optimization
Fine-tuned with the M-FAC second-order optimizer, which demonstrates performance improvements over the traditional Adam optimizer.
Lightweight architecture
Based on the BERT-mini architecture with fewer parameters, making it suitable for resource-constrained environments.
Robust performance
Exhibits stable performance across multiple runs with small standard deviations.

Model Capabilities

Text classification
Natural language inference

Use Cases

Text understanding
Natural language inference
Determines the relationship between two sentences (entailment, contradiction, or neutral); see the usage sketch below.
Achieves approximately 75% accuracy on the MNLI validation set.
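
Usage Example

The snippet below is a minimal inference sketch for the natural language inference use case. It assumes the checkpoint is published on the Hugging Face Hub under the ID "M-FAC/bert-mini-finetuned-mnli" and that the standard transformers sequence-classification API applies; the label names (entailment / neutral / contradiction) are read from the checkpoint's config rather than hard-coded here.

```python
# Minimal inference sketch; the Hub ID below is an assumption based on the model name.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "M-FAC/bert-mini-finetuned-mnli"  # assumed Hugging Face Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

# MNLI inputs are sentence pairs: the premise followed by the hypothesis.
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(dim=-1).item()
# The id-to-label mapping comes from the checkpoint's config, not from this script.
print(model.config.id2label[predicted_id])
```

Because BERT-mini is small, this runs comfortably on CPU; for batch scoring, the same tokenizer call accepts lists of premise/hypothesis pairs.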