
DistilBERT Base Uncased Fine-tuned NER

Developed by srosy
A lightweight named entity recognition model based on DistilBERT, fine-tuned on the CoNLL-2003 dataset
Downloads 16
Release Time: 3/2/2022

Model Overview

This model is a fine-tuned version of DistilBERT designed for Named Entity Recognition (NER) tasks, trained and evaluated on the CoNLL-2003 dataset with the results listed below.

Model Features

Efficient and Lightweight
Based on the DistilBERT architecture, it is smaller and faster than standard BERT models while maintaining high performance
High Accuracy
Achieves 98.44% accuracy and a 93.23% F1 score on the CoNLL-2003 dataset
Fast Inference
The distilled design enables faster inference than the full BERT model

Model Capabilities

Named Entity Recognition
Text Token Classification
Entity Extraction
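
These capabilities can be exercised through the Hugging Face transformers token-classification pipeline. The sketch below is a minimal, non-authoritative example; the model id srosy/distilbert-base-uncased-finetuned-ner is an assumption derived from the developer and model name listed above and may need to be adjusted.

```python
from transformers import pipeline

# Assumed Hugging Face model id, inferred from the developer and model name above.
ner = pipeline(
    "token-classification",
    model="srosy/distilbert-base-uncased-finetuned-ner",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entity spans
)

text = "Angela Merkel visited the Siemens headquarters in Munich on Tuesday."
for entity in ner(text):
    # Each prediction carries the merged entity text, its CoNLL-2003 label
    # (PER, ORG, LOC, MISC), a confidence score, and the character span.
    print(entity["word"], entity["entity_group"], round(float(entity["score"]), 3))
```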

Use Cases

Natural Language Processing
News Entity Extraction
Identify entities such as person names, locations, and organization names from news text
High accuracy in recognizing various named entities
Document Information Extraction
Extract key entity information from legal or business documents
Can help automate document processing workflows
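
A hedged sketch of how such a workflow might look: entities extracted from a document are grouped by type before being handed to downstream processing. The model id and the group_entities helper are illustrative assumptions, not part of the original model card.

```python
from collections import defaultdict

from transformers import pipeline

# Assumed model id; substitute the actual id if it differs.
ner = pipeline(
    "token-classification",
    model="srosy/distilbert-base-uncased-finetuned-ner",
    aggregation_strategy="simple",
)

def group_entities(document):
    """Collect unique entity mentions from a document, keyed by entity type."""
    grouped = defaultdict(list)
    for ent in ner(document):
        if ent["word"] not in grouped[ent["entity_group"]]:
            grouped[ent["entity_group"]].append(ent["word"])
    return dict(grouped)

contract = (
    "This agreement is made between Acme Corporation and Jane Doe, "
    "signed in New York."
)
print(group_entities(contract))
# Expected shape of the output (actual labels depend on the model's predictions):
# {'ORG': ['Acme Corporation'], 'PER': ['Jane Doe'], 'LOC': ['New York']}
```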