Bert Base Uncased Sparse 70 Unstructured
This is a pruned version of the BERT base model with 70% sparsity, suitable for fine-tuning on downstream tasks.
Release date: 3/2/2022
Model Overview
This model is a pruned version of the BERT base model with 70% unstructured sparsity built in: 70% of the weights have been set to zero. It is intended as a starting point for sparsity-preserving fine-tuning on downstream natural language processing tasks.
Model Features
High Sparsity
The model has been pruned so that 70% of its weights are zero (unstructured sparsity), which shrinks the compressed model size and, with a sparsity-aware runtime, can reduce inference compute.
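As a rough illustration of what 70% unstructured (magnitude) pruning means, the sketch below zeros the 70% smallest-magnitude entries of a random weight matrix and then measures the resulting sparsity. The matrix shape and pruning criterion here are assumptions for illustration, not the recipe used to produce this model.

```python
import numpy as np

def sparsity(weights: np.ndarray) -> float:
    """Fraction of exactly-zero entries in a weight tensor."""
    return float(np.mean(weights == 0.0))

# Illustration: prune a dense matrix to ~70% unstructured sparsity by
# zeroing the 70% of entries with the smallest magnitude.
rng = np.random.default_rng(0)
w = rng.normal(size=(768, 768))           # 768 = BERT-base hidden size
threshold = np.quantile(np.abs(w), 0.70)  # magnitude cutoff
w_pruned = np.where(np.abs(w) <= threshold, 0.0, w)

print(f"sparsity: {sparsity(w_pruned):.2f}")  # close to 0.70
```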
Sparsity Preservation
Because the sparsity is built in, zero-valued weights can be kept at exactly zero during fine-tuning by applying a binary mask to the weight updates, so the sparsity level is preserved on the downstream task.
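A minimal sketch of the masking idea, assuming plain SGD for simplicity: a binary mask records which weights survived pruning, and each gradient update is multiplied by the mask so that pruned weights can never be revived. This is an illustrative stand-in for whatever optimizer and training loop you actually use.

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.normal(size=(4, 4))
w[np.abs(w) < np.quantile(np.abs(w), 0.70)] = 0.0  # prune to ~70% sparsity

# 1 where a weight survived pruning, 0 where it was zeroed out.
mask = (w != 0.0).astype(w.dtype)

def masked_sgd_step(w, grad, mask, lr=0.01):
    """One SGD step that cannot revive pruned (masked-out) weights."""
    return w - lr * (grad * mask)

grad = rng.normal(size=w.shape)  # stand-in for a real backprop gradient
w = masked_sgd_step(w, grad, mask)

# Pruned positions are still exactly zero after the update.
print(np.all(w[mask == 0] == 0.0))  # True
```

The same invariant holds for any optimizer, as long as the mask is applied to the update (or re-applied to the weights) at every step.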
BERT-based Architecture
Built on the BERT base (uncased) architecture, so it inherits BERT's language understanding capabilities and can be fine-tuned with the same tooling as the dense model.
Model Capabilities
Text classification
Named entity recognition
Question answering
Text similarity calculation
Use Cases
Natural Language Processing
Sentiment Analysis
Analyze the sentiment polarity of text.
Text Classification
Classify text into predefined categories.