DistilBERT Base Uncased Fine-tuned NER
A lightweight named entity recognition model based on DistilBERT, fine-tuned on the CoNLL-2003 dataset and suitable for entity-labeling tasks in English text.
Downloads: 15
Release date: 3/2/2022
Model Overview
This model is a fine-tuned version of DistilBERT designed for Named Entity Recognition (NER). It identifies entities such as person names, locations, and organization names in text, and performs strongly on the CoNLL-2003 evaluation set.
Model Features
Efficient and Lightweight
Based on the DistilBERT architecture, it is 40% smaller than standard BERT while retaining 95% of its performance.
High-precision NER
Achieves an F1 score of 0.9308 on the standard CoNLL-2003 test set.
Fast Inference
The distilled architecture speeds up inference, making the model suitable for production deployment.
Model Capabilities
Named Entity Recognition
Text Token Classification
Entity Boundary Detection
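The capabilities above (token classification and entity boundary detection) rest on the BIO tagging scheme used by CoNLL-2003, where the model assigns one tag per token and entity spans are recovered by grouping adjacent tags. A minimal, self-contained sketch of that grouping step, with a hard-coded tagged sentence standing in for real model output:

```python
# CoNLL-2003 uses the BIO scheme: B-X opens an entity of type X
# (PER, LOC, ORG, MISC), I-X continues it, and O marks non-entity
# tokens. Grouping the per-token tags recovers entity spans and
# their boundaries. The tagged sentence below is an illustrative
# stand-in for actual model output, not taken from the model card.

def group_entities(tokens, tags):
    """Merge BIO-tagged tokens into (entity_text, entity_type) spans."""
    entities, current, current_type = [], [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                entities.append((" ".join(current), current_type))
            current, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current_type == tag[2:]:
            current.append(token)
        else:  # an "O" tag (or an inconsistent I- tag) closes any open span
            if current:
                entities.append((" ".join(current), current_type))
            current, current_type = [], None
    if current:
        entities.append((" ".join(current), current_type))
    return entities

tokens = ["Angela", "Merkel", "visited", "Paris", "for", "EU", "talks"]
tags   = ["B-PER", "I-PER", "O", "B-LOC", "O", "B-ORG", "O"]
print(group_entities(tokens, tags))
# → [('Angela Merkel', 'PER'), ('Paris', 'LOC'), ('EU', 'ORG')]
```

In practice the Hugging Face `transformers` token-classification pipeline can perform this aggregation for you; the sketch makes the boundary logic explicit.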
Use Cases
Information Extraction
News Entity Extraction
Automatically identifies people, locations, and organizations in news articles.
With an accuracy of 98.36%, it can effectively support news classification systems.
Knowledge Graph Construction
Knowledge Graph Entity Labeling
Provides high-quality entity labels for knowledge graph construction.
An F1 score of 0.9308 reflects balanced precision and recall in entity recognition.
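One way the labeled entities can feed a knowledge graph is by emitting subject–predicate–object triples per document. A hypothetical sketch under that assumption; the predicate names, document id, and entity list below are illustrative, not part of the model:

```python
# Hypothetical sketch: turn grouped NER output into simple
# knowledge-graph triples. The predicate vocabulary and the
# hard-coded entity list are illustrative assumptions.

def entities_to_triples(doc_id, entities):
    """Emit (subject, predicate, object) triples from labeled entities."""
    predicate = {
        "PER": "mentions_person",
        "LOC": "mentions_location",
        "ORG": "mentions_organization",
    }
    return [(doc_id, predicate.get(etype, "mentions"), text)
            for text, etype in entities]

# Entities as a grouped NER pass might return them for one news article.
entities = [("Angela Merkel", "PER"), ("Paris", "LOC"), ("EU", "ORG")]
for triple in entities_to_triples("article:42", entities):
    print(triple)
```

Downstream, such triples can be deduplicated and linked against an existing entity catalog before being loaded into the graph store.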
© 2025 AIbase