DistilBERT Base Uncased Fine-tuned NER
A lightweight named entity recognition model based on DistilBERT, fine-tuned on the CoNLL-2003 dataset and suited to entity labeling tasks in English text.
Downloads: 16
Release Time: 3/2/2022
Model Overview
This model is a fine-tuned version of DistilBERT designed specifically for Named Entity Recognition (NER). It was fine-tuned on the CoNLL-2003 dataset and can identify entities such as person names, locations, and organization names in text.
Model Features
Efficient and Lightweight
Built on the DistilBERT architecture, it is 40% smaller and 60% faster than BERT base while retaining about 97% of its language-understanding performance.
High Accuracy
Achieves an accuracy of 0.9843 and an F1 score of 0.9327 on the CoNLL-2003 test set.
Ready-to-Use Model
Pre-trained and fine-tuned for NER tasks, it can be directly used for entity recognition applications.
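As a minimal sketch of direct use with the Hugging Face transformers pipeline; the model identifier below is a placeholder assumption and should be replaced with the actual repository name of this checkpoint.

```python
# Minimal sketch: run NER with the transformers pipeline.
# "distilbert-base-uncased-finetuned-ner" is a placeholder identifier;
# point it at the actual repository name of this checkpoint.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="distilbert-base-uncased-finetuned-ner",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entities
)

print(ner("Hugging Face is based in New York City."))
```

Each result carries the entity type, the matched surface form, character offsets, and a confidence score.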
Model Capabilities
Named Entity Recognition
Text Token Classification
Entity Boundary Detection
Entity Type Classification
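A short sketch of how the boundary and type tags behind these capabilities can be inspected from the model configuration, again using the placeholder identifier from above and assuming the standard CoNLL-2003 BIO tag scheme.

```python
# Sketch: inspect the label set used for boundary and type classification.
# The identifier is a placeholder for the actual repository name.
from transformers import AutoModelForTokenClassification

model = AutoModelForTokenClassification.from_pretrained(
    "distilbert-base-uncased-finetuned-ner"
)

# CoNLL-2003 style BIO tags: B-/I- prefixes mark entity boundaries,
# PER/ORG/LOC/MISC are the entity types, and O marks non-entity tokens.
print(model.config.id2label)
```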
Use Cases
Information Extraction
News Entity Extraction
Automatically identifies people, places, and organizations in news articles
Accurately marks the key entities in the text
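A hedged sketch of this use case, grouping the entities detected in a short news snippet by type; the model identifier is the same placeholder as above and the article text is invented for illustration.

```python
# Sketch: group detected entities by type for a news snippet.
from collections import defaultdict
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="distilbert-base-uncased-finetuned-ner",  # placeholder identifier
    aggregation_strategy="simple",
)

article = ("Apple announced on Monday that Tim Cook will open "
           "a new campus in Austin, Texas.")

by_type = defaultdict(set)
for entity in ner(article):
    by_type[entity["entity_group"]].add(entity["word"])

print(dict(by_type))  # e.g. ORG, PER, and LOC mentions grouped by type
```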
Document Analysis
Recognizes domain-specific terms and entities in legal or medical documents
Helps quickly locate key entity information within a document
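One possible way to surface only confident entities when scanning a document; the 0.90 threshold, the sample text, and the model identifier are illustrative assumptions rather than recommended settings.

```python
# Sketch: keep only high-confidence entities from a longer document.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="distilbert-base-uncased-finetuned-ner",  # placeholder identifier
    aggregation_strategy="simple",
)

document = ("The agreement between Acme Corp and the City of Boston "
            "was reviewed by Dr. Jane Smith on behalf of the plaintiff.")

confident = [e for e in ner(document) if e["score"] >= 0.90]  # assumed threshold
for e in confident:
    print(e["entity_group"], e["word"])
```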
Data Preprocessing
Knowledge Graph Construction
Provides entity recognition as a preprocessing step for knowledge graph generation
Raises the degree of automation in knowledge graph construction
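A sketch of that preprocessing step, turning pipeline output into (mention, type, span) node candidates for a downstream knowledge graph; the node schema is purely illustrative and the model identifier is the same placeholder as above.

```python
# Sketch: convert NER output into node candidates for a knowledge graph.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="distilbert-base-uncased-finetuned-ner",  # placeholder identifier
    aggregation_strategy="simple",
)

text = "Marie Curie worked at the University of Paris."

nodes = [
    {"mention": e["word"], "type": e["entity_group"], "span": (e["start"], e["end"])}
    for e in ner(text)
]
print(nodes)
```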