
BERT Base Uncased MNLI Sparse 70% Unstructured (No Classifier)

Developed by Intel
This model is fine-tuned from bert-base-uncased-sparse-70-unstructured on the MNLI task (GLUE benchmark), with the classifier layer removed so the checkpoint can be loaded more easily as a starting point for training on other downstream tasks.
Release Time: 3/2/2022

Model Overview

This is a sparse BERT-based model specifically fine-tuned for the MNLI task, with the classifier layer removed to facilitate transfer to other tasks.

Model Features

Sparse Architecture
Employs 70% unstructured sparsity in the BERT architecture, reducing the number of nonzero weights and improving model efficiency
Transferability
Classifier layer removed for easier transfer to other downstream tasks
Multi-task Adaptation
Transfer performance validated on multiple GLUE tasks
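As a hedged illustration of what 70% unstructured sparsity means, the sketch below applies magnitude pruning to a random weight matrix with NumPy. The threshold-based pruning recipe here is an assumption for illustration only; it is not Intel's actual sparsification procedure for this checkpoint.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude entries until `sparsity` fraction are zero.

    "Unstructured" means any individual weight may be zeroed, regardless of
    its row, column, or block position.
    """
    k = int(weights.size * sparsity)                        # number of weights to zero
    threshold = np.sort(np.abs(weights), axis=None)[k - 1]  # k-th smallest magnitude
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

rng = np.random.default_rng(0)
w = rng.normal(size=(768, 768))                 # a BERT-sized hidden-to-hidden matrix
w_sparse = magnitude_prune(w, sparsity=0.70)
print(f"sparsity: {np.mean(w_sparse == 0):.2f}")  # ~0.70
```

Because the zeros can land anywhere, unstructured sparsity preserves accuracy better than structured pruning at the same ratio, but it needs sparse-aware kernels or hardware to translate into speedups.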

Model Capabilities

Natural Language Understanding
Text Classification
Sentence Similarity Calculation
Question Answering Systems

Use Cases

Text Analysis
Natural Language Inference
Determine the relationship between two sentences (entailment/contradiction/neutral)
Achieves 82.5% accuracy on the matched set and 83.3% on the mismatched set
Transfer Learning
Downstream Task Adaptation
Can be transferred to QQP, QNLI, SST-2, and other GLUE tasks
Achieved 90.2% accuracy on QQP task
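Since the released checkpoint ships without a classifier head, a downstream user attaches a freshly initialized one. A minimal NumPy sketch of a three-way NLI head on top of a pooled [CLS] vector follows; the weight shapes, initialization scale, and label order are assumptions for illustration, not the actual head Intel used.

```python
import numpy as np

LABELS = ["entailment", "neutral", "contradiction"]  # assumed label order

def softmax(z: np.ndarray) -> np.ndarray:
    e = np.exp(z - z.max())          # subtract max for numerical stability
    return e / e.sum()

def classify(pooled: np.ndarray, W: np.ndarray, b: np.ndarray) -> str:
    """A fresh linear head: logits = pooled @ W + b, argmax over the 3 NLI labels."""
    probs = softmax(pooled @ W + b)
    return LABELS[int(np.argmax(probs))]

rng = np.random.default_rng(0)
pooled = rng.normal(size=768)         # stand-in for the encoder's pooled output
W = rng.normal(size=(768, 3)) * 0.02  # newly initialized head, to be trained
b = np.zeros(3)
print(classify(pooled, W, b))
```

In practice the head is trained jointly with (or on top of) the sparse encoder on the target task's labels; the same pattern applies whether the downstream task has two labels (QQP, SST-2) or three (MNLI).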