
Distilbert Token Itr0 0.0001 Editorials 01 03 2022 15 20 12

Developed by ali2066
A DistilBERT model fine-tuned from bert-base-uncased, primarily used for text classification tasks
Downloads: 22
Release Time: 3/2/2022

Model Overview

This model is a version of bert-base-uncased fine-tuned on an unspecified dataset for text classification, reaching 0.9707 accuracy on its evaluation set.

Model Features

Efficient Fine-tuning
Fine-tuned from bert-base-uncased, retaining much of BERT's performance while reducing computational resource requirements.
High Accuracy
Achieved an accuracy of 0.9707 on the evaluation set, demonstrating excellent performance.
Lightweight
As a DistilBERT model, it is more lightweight compared to the original BERT model, suitable for resource-constrained environments.

Model Capabilities

Text Classification
Natural Language Processing

Use Cases

Text Analysis
Editorial Classification
Can be used to classify editorial articles by topic or sentiment tendency.
Accuracy: 0.9707
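A minimal inference sketch with the Hugging Face transformers library. The repository id below is an assumption reconstructed from the model title and author name shown on this page; verify the exact id on the Hub before use. Note also that the card describes the model as a text classifier, although the "token" in its name could indicate a token-classification checkpoint.

```python
# Assumed repo id, pieced together from the card's title and author (ali2066);
# confirm on huggingface.co before relying on it.
MODEL_ID = "ali2066/distilbert_token_itr0_0.0001_editorials_01_03_2022-15_20_12"


def classify_editorial(text: str):
    """Load the fine-tuned checkpoint and classify a single editorial."""
    # Imported lazily so the module loads without transformers installed;
    # requires `pip install transformers`.
    from transformers import pipeline

    clf = pipeline("text-classification", model=MODEL_ID)
    # Returns a list like [{"label": ..., "score": ...}]
    return clf(text)


if __name__ == "__main__":
    print(classify_editorial("The council's new housing policy deserves support."))
```

Downloading the checkpoint happens on the first call; subsequent calls reuse the local cache.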