
distilbert-base-uncased-finetuned-ner

Developed by kinanmartin
This model is a lightweight version based on DistilBERT, fine-tuned for Named Entity Recognition (NER) tasks on a toy dataset.
Release Time: 7/8/2022

Model Overview

This model is a fine-tuned version of distilbert-base-uncased on a toy dataset, primarily used for token classification tasks such as Named Entity Recognition.
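A fine-tuned token-classification checkpoint like this is typically loaded through the `transformers` pipeline API. The Hub id below is inferred from the author and model name on this card (`kinanmartin/distilbert-base-uncased-finetuned-ner`) and is an assumption, not confirmed by the card itself:

```python
from transformers import pipeline

# Hub id inferred from the card's author/model name; adjust if the actual id differs.
ner = pipeline(
    "token-classification",
    model="kinanmartin/distilbert-base-uncased-finetuned-ner",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entity spans
)

results = ner("Hugging Face is based in New York City.")
for entity in results:
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

`aggregation_strategy="simple"` groups WordPiece fragments back into full words, so the output lists whole entities rather than individual sub-tokens.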

Model Features

Lightweight Model
Built on the DistilBERT architecture, which is roughly 40% smaller and 60% faster than BERT-base while retaining about 97% of its language-understanding performance.
High Accuracy
Achieves an accuracy of 0.9640 and an F1 score of 0.8544 on the evaluation set.
Efficient Training
Reaches its reported performance after only 3 training epochs, making fine-tuning inexpensive.
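The F1 score quoted above is the harmonic mean of precision and recall over predicted entities. A minimal sketch of entity-level scoring, where a prediction counts as correct only on an exact (type, start, end) span match; the spans below are illustrative toys, not the card's actual evaluation data:

```python
def entity_f1(gold, pred):
    """Entity-level F1: an entity is correct only if its
    (type, start, end) span matches the gold span exactly."""
    gold_set, pred_set = set(gold), set(pred)
    tp = len(gold_set & pred_set)                          # exact-match true positives
    precision = tp / len(pred_set) if pred_set else 0.0
    recall = tp / len(gold_set) if gold_set else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Toy example: gold vs. predicted (type, start, end) spans
gold = [("PER", 0, 2), ("LOC", 5, 6), ("ORG", 8, 10)]
pred = [("PER", 0, 2), ("LOC", 5, 6), ("ORG", 8, 9)]
print(entity_f1(gold, pred))  # 2 of 3 spans match exactly, so F1 = 2/3
```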

Model Capabilities

Named Entity Recognition
Token Classification

Use Cases

Text Processing
Named Entity Recognition
Identifies and classifies named entities in text, such as person names, locations, and organizations.
Achieved an F1 score of 0.8544 on the toy dataset.
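Token classifiers like this one emit one tag per token; downstream code then groups `B-`/`I-` tags into entity spans. A minimal decoder sketch, assuming the common CoNLL-style BIO tag set (the card does not state which label scheme its toy dataset uses):

```python
def decode_bio(tokens, tags):
    """Group per-token BIO tags into (entity_type, text) spans."""
    entities, current = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):            # beginning of a new entity
            if current:
                entities.append(current)
            current = (tag[2:], [token])
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            current[1].append(token)        # continuation of the current entity
        else:                               # "O" or inconsistent tag ends the span
            if current:
                entities.append(current)
            current = None
    if current:
        entities.append(current)
    return [(etype, " ".join(words)) for etype, words in entities]

tokens = ["Barack", "Obama", "visited", "New", "York", "."]
tags = ["B-PER", "I-PER", "O", "B-LOC", "I-LOC", "O"]
print(decode_bio(tokens, tags))  # [('PER', 'Barack Obama'), ('LOC', 'New York')]
```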