
Distilbert Base Uncased Finetuned TT2 Exam

Developed by roschmid
This model is a fine-tuned version of distilbert-base-uncased on the conll2003 dataset, designed for token classification tasks.
Downloads: 15
Released: May 23, 2022

Model Overview

This is a DistilBERT model fine-tuned specifically for token classification, with strong results on the conll2003 evaluation set (F1 score of 0.9295).
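The model can be loaded for inference through the Hugging Face transformers pipeline. A minimal sketch, assuming the hub path roschmid/distilbert-base-uncased-finetuned-TT2-exam (inferred from the card title, so adjust if the actual path differs):

```python
from transformers import pipeline

# Hub path is an assumption inferred from the card title; adjust if it differs.
MODEL_ID = "roschmid/distilbert-base-uncased-finetuned-TT2-exam"

# The token-classification pipeline handles tokenization, inference, and label
# decoding; aggregation_strategy="simple" merges B-/I- sub-token tags into
# whole entity spans.
ner = pipeline("token-classification", model=MODEL_ID, aggregation_strategy="simple")

for entity in ner("Angela Merkel visited the Microsoft office in Berlin."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

Each result carries the aggregated entity label (e.g. PER, ORG, LOC), the matched text span, and a confidence score.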

Model Features

Efficient Fine-tuning
Built on the DistilBERT architecture and fine-tuned on the conll2003 dataset, adapting the general-purpose base model into an effective token classifier.
High Performance
Achieves high precision (0.9222), recall (0.9369), and F1 score (0.9295) on the evaluation set; the F1 score is the harmonic mean of precision and recall (see the check after this list).
Lightweight
DistilBERT is smaller and faster than the original BERT model (roughly 40% fewer parameters while retaining most of its accuracy), making it well suited to resource-constrained environments.
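As a quick sanity check, the reported F1 score is consistent with the reported precision and recall, since F1 is their harmonic mean:

```python
# F1 is the harmonic mean of precision and recall: F1 = 2PR / (P + R).
precision, recall = 0.9222, 0.9369
f1 = 2 * precision * recall / (precision + recall)
print(f"{f1:.4f}")  # 0.9295, matching the reported score
```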

Model Capabilities

Token Classification
Natural Language Processing
Entity Recognition

Use Cases

Natural Language Processing
Named Entity Recognition
Identifies named entities in text, such as person names (PER), locations (LOC), organization names (ORG), and the miscellaneous (MISC) category annotated in conll2003.
Achieves an F1 score of 0.9295 on the conll2003 evaluation set.
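For finer control than the pipeline shown above, the model can also be run directly and its per-token predictions mapped to BIO tags. A sketch, again assuming the roschmid/distilbert-base-uncased-finetuned-TT2-exam hub path:

```python
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

MODEL_ID = "roschmid/distilbert-base-uncased-finetuned-TT2-exam"  # assumed hub path

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForTokenClassification.from_pretrained(MODEL_ID)

text = "Apple opened a new store in San Francisco."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map each sub-token to its highest-scoring BIO tag (e.g. B-ORG, I-LOC, O)
# using the label mapping stored in the model config.
predictions = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, pred in zip(tokens, predictions):
    print(token, model.config.id2label[pred.item()])
```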