DistilBERT Base Uncased Fine-tuned NER
A lightweight named entity recognition model based on DistilBERT, fine-tuned on the CoNLL-2003 dataset.
Downloads: 16
Release date: 3/2/2022
Model Overview
This model is a fine-tuned version of DistilBERT for Named Entity Recognition (NER). It was trained on the CoNLL-2003 dataset and can identify person names, locations, organization names, and miscellaneous entities in text.
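As a minimal usage sketch, the snippet below loads the model through the Hugging Face transformers token-classification pipeline. The Hub repository ID shown is an assumption and should be replaced with the actual ID of this checkpoint; the sample sentence is illustrative.

```python
from transformers import pipeline

# Assumed Hub repository ID; substitute the actual ID for this checkpoint.
MODEL_ID = "elastic/distilbert-base-uncased-finetuned-ner"

# "simple" aggregation merges word-piece tokens back into whole entity spans.
ner = pipeline("token-classification", model=MODEL_ID, aggregation_strategy="simple")

for entity in ner("Angela Merkel met Tim Cook at the Apple office in Berlin."):
    # Each result holds the entity type (PER/LOC/ORG/MISC), the matched text,
    # a confidence score, and character offsets into the input string.
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```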
Model Features
Efficient and Lightweight
Built on the DistilBERT architecture, which has roughly 40% fewer parameters and runs about 60% faster than BERT-base while retaining most of its accuracy.
High Accuracy
Achieves an F1 score of 0.9304 and accuracy of 0.9837 on the CoNLL-2003 test set.
Fast Inference
The distilled architecture reduces inference latency, making the model suitable for production deployment.
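To illustrate the throughput side, here is a batched-inference sketch with the same pipeline; the Hub ID, batch size, and device setting are assumptions to be tuned for the target hardware.

```python
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="elastic/distilbert-base-uncased-finetuned-ner",  # assumed Hub ID
    aggregation_strategy="simple",
    device=-1,  # CPU; set to a GPU index (e.g. 0) if one is available
)

texts = [
    "The European Central Bank raised rates in Frankfurt.",
    "Microsoft opened a new data center in Dublin.",
]

# Passing a list lets the pipeline batch requests internally; batch_size is a tuning knob.
for doc_entities in ner(texts, batch_size=16):
    print([(e["entity_group"], e["word"]) for e in doc_entities])
```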
Model Capabilities
Text Entity Recognition
Named Entity Labeling
Sequence Labeling
Use Cases
Information Extraction
News Text Analysis
Extract key entities such as person names, locations, and organization names from news articles.
Accurately identifies various named entities in text.
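A possible sketch of this use case: run the pipeline over a news sentence and group the detected entities by type. The Hub ID and the headline are illustrative assumptions.

```python
from collections import defaultdict
from transformers import pipeline

# Assumed Hub ID; replace with the actual repository for this checkpoint.
ner = pipeline("token-classification",
               model="elastic/distilbert-base-uncased-finetuned-ner",
               aggregation_strategy="simple")

headline = ("Reuters reports that Emmanuel Macron will meet executives "
            "from Siemens and Airbus in Paris next week.")

# Group detected entities by their CoNLL-2003 type (PER, LOC, ORG, MISC).
entities_by_type = defaultdict(set)
for entity in ner(headline):
    entities_by_type[entity["entity_group"]].add(entity["word"])

print(dict(entities_by_type))
```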
Document Processing
Automatically label key entities in contracts or legal documents.
Improves document processing efficiency and reduces manual labeling efforts.
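As an illustrative sketch of automatic labeling, the following uses the character offsets returned by the pipeline to wrap each detected entity in an inline tag. The bracket markup, sample sentence, and Hub ID are assumptions, not a prescribed output format.

```python
from transformers import pipeline

# Assumed Hub ID; replace with the actual repository for this checkpoint.
ner = pipeline("token-classification",
               model="elastic/distilbert-base-uncased-finetuned-ner",
               aggregation_strategy="simple")

contract = "This agreement is made between Acme Corporation and John Smith in New York."

# Insert tags from right to left so earlier character offsets stay valid.
annotated = contract
for entity in sorted(ner(contract), key=lambda e: e["start"], reverse=True):
    start, end, label = entity["start"], entity["end"], entity["entity_group"]
    annotated = f"{annotated[:start]}[{label}: {annotated[start:end]}]{annotated[end:]}"

print(annotated)
```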
Knowledge Graph Construction
Entity Relation Extraction
Identify entities in text as a preliminary step for knowledge graph construction.
Provides the foundation for subsequent relation extraction and knowledge graph building.
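As a rough sketch of how recognized entities might feed that downstream step, the snippet below treats entities as candidate graph nodes and pairs entities that co-occur in the same sentence as candidate edges. The co-occurrence heuristic, Hub ID, and example text are assumptions for illustration; the model itself does not perform relation extraction.

```python
from itertools import combinations
from transformers import pipeline

# Assumed Hub ID; replace with the actual repository for this checkpoint.
ner = pipeline("token-classification",
               model="elastic/distilbert-base-uncased-finetuned-ner",
               aggregation_strategy="simple")

text = "Sundar Pichai announced that Google will expand its offices in London."

# Entities become candidate graph nodes; co-occurring pairs become candidate edges
# to be scored or labeled by a separate relation extraction step.
nodes = [(e["entity_group"], e["word"]) for e in ner(text)]
candidate_edges = list(combinations(nodes, 2))

print("nodes:", nodes)
print("candidate edges:", candidate_edges)
```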