
BERT-Tiny Fine-Tuned on QNLI

Developed by M-FAC
This is a BERT-tiny model fine-tuned on the QNLI dataset using the M-FAC second-order optimizer, demonstrating better performance than the traditional Adam optimizer.
Downloads 97.76k
Release Time: 3/2/2022

Model Overview

Based on the BERT-tiny architecture and fine-tuned with the M-FAC optimizer on the QNLI dataset for question-answering natural language inference.
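A minimal inference sketch with the `transformers` library. The repository id `M-FAC/bert-tiny-finetuned-qnli` is assumed from the model name above; verify it on the Hugging Face Hub before use.

```python
# Sketch: classify a question-sentence pair with the fine-tuned model.
# The repo id below is an assumption inferred from the model card title.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "M-FAC/bert-tiny-finetuned-qnli"  # assumed Hub repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

question = "Where was the Eiffel Tower built?"
sentence = "The Eiffel Tower was constructed in Paris for the 1889 World's Fair."

# QNLI is a sentence-pair task: the tokenizer joins the question and the
# candidate sentence with a [SEP] token into one input sequence.
inputs = tokenizer(question, sentence, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()
# Label names depend on the uploaded config,
# e.g. "entailment" / "not_entailment" or "LABEL_0" / "LABEL_1".
print(model.config.id2label[pred])
```

The same pair input can be run through a `text-classification` pipeline; the explicit tokenizer call is shown here to make the sentence-pair encoding visible.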

Model Features

M-FAC second-order optimization
Fine-tuned with the M-FAC second-order optimizer, which yields better downstream accuracy than the traditional Adam optimizer.
Lightweight architecture
Based on the lightweight BERT-tiny architecture, suitable for deployment in resource-constrained environments.
Stable performance
Accuracy is stable across multiple fine-tuning runs, with small standard deviations.
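M-FAC's efficient implementation lives in its own research codebase; the following is only an illustrative NumPy sketch of the core idea behind such second-order methods: preconditioning the current gradient with the inverse of an empirical-Fisher estimate built from a sliding window of past gradients, made cheap via the Woodbury identity. The function name and window size are illustrative, not the library's API.

```python
import numpy as np

def mfac_precondition(g, G, lam=1e-3):
    """Illustrative sketch (not the official M-FAC implementation).

    G holds the last m gradients, one per row (m x d).  The empirical
    Fisher estimate is F = (1/m) * G.T @ G, and we return
    (lam*I + F)^{-1} @ g.  The Woodbury identity reduces this to an
    m x m solve instead of a d x d one, which is what makes a
    gradient-window approach tractable for large models:
      (lam*I + G.T G / m)^{-1} g
        = (1/lam) * (g - G.T @ solve(m*lam*I_m + G @ G.T, G @ g))
    """
    m = G.shape[0]
    small = m * lam * np.eye(m) + G @ G.T          # m x m system
    correction = G.T @ np.linalg.solve(small, G @ g)
    return (g - correction) / lam

# Toy usage: a window of 5 past gradients in a 20-dim parameter space.
rng = np.random.default_rng(0)
G = rng.normal(size=(5, 20))
g = rng.normal(size=20)
update = mfac_precondition(g, G)   # preconditioned descent direction
```

The preconditioned `update` would replace the raw gradient in the optimizer step; the Woodbury trick keeps the per-step cost linear in the parameter count rather than quadratic.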

Model Capabilities

Question-answering natural language inference (QNLI)
Text classification
Semantic understanding

Use Cases

Education
Automated Q&A system
Used in educational automated Q&A systems to determine the logical relationship between student questions and reference answers.
Achieved 81.54% accuracy on the QNLI validation set.
Customer service
Intelligent customer service
Used to assess the relevance between user questions and knowledge base answers.