DistilBERT Base Uncased Fine-tuned NER
This is a lightweight model based on DistilBERT, fine-tuned on the CoNLL-2003 Named Entity Recognition (NER) task.
Downloads: 15
Release Time: 5/9/2022
Model Overview
This model is specifically designed for named entity recognition tasks, capable of identifying entities such as person names, locations, and organization names in text.
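The snippet below is a minimal usage sketch with the Hugging Face transformers pipeline. It assumes the checkpoint is published on the Hugging Face Hub; the repository id shown is inferred from this page's title and is a placeholder that may need to be replaced with the fully namespaced id of the actual upload.

```python
# Minimal NER sketch (assumption: the model id below is a placeholder inferred from the title).
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="distilbert-base-uncased-finetuned-ner",  # placeholder id
    aggregation_strategy="simple",                  # merge sub-word tokens into whole entities
)

text = "Hugging Face was founded in New York by Clément Delangue."
for entity in ner(text):
    # Each result carries the entity group (PER/ORG/LOC/MISC), the matched span, and a confidence score.
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```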
Model Features
Lightweight and Efficient
Based on the DistilBERT architecture, it is 40% smaller than standard BERT while retaining 95% of its performance.
High Accuracy
Achieves an F1 score of 90.8% on the CoNLL-2003 test set.
Fast Inference
The distilled architecture enables faster inference compared to the full BERT model.
Model Capabilities
Named Entity Recognition
Text Token Classification
Entity Boundary Detection
Entity Type Classification
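To make these capabilities concrete, here is a hedged sketch that calls the token-classification model directly rather than through the pipeline helper, showing how CoNLL-2003 BIO tags (e.g. B-PER, I-ORG) together with character offsets give both entity type classification and boundary detection. The model id is again a placeholder assumption.

```python
# Token-level sketch (assumption: placeholder model id; CoNLL-2003 BIO label scheme).
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

model_id = "distilbert-base-uncased-finetuned-ner"  # placeholder id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

text = "Angela Merkel visited the Siemens headquarters in Munich."
inputs = tokenizer(text, return_tensors="pt", return_offsets_mapping=True)
offsets = inputs.pop("offset_mapping")[0]  # character spans, not a model input

with torch.no_grad():
    logits = model(**inputs).logits

predictions = logits.argmax(dim=-1)[0]
for (start, end), label_id in zip(offsets.tolist(), predictions.tolist()):
    label = model.config.id2label[label_id]
    if label != "O" and start != end:  # skip non-entities and special tokens
        print(text[start:end], label, (start, end))
```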
Use Cases
Information Extraction
News Article Entity Extraction
Extract person names, organization names, and locations from news text.
Accuracy: 97.94%
Automated Document Processing
Automatically identify key entities in contracts or legal documents.
Knowledge Graph Construction
Knowledge Graph Entity Linking
Identify entities in text for knowledge graph construction.
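As an illustration only (not part of the original model card), the following sketch collects recognized entities into deduplicated node candidates that could seed a knowledge graph; the model id remains a placeholder assumption.

```python
# Illustrative sketch: turn NER output into deduplicated node candidates for a knowledge graph.
from collections import defaultdict
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="distilbert-base-uncased-finetuned-ner",  # placeholder id
    aggregation_strategy="simple",
)

documents = [
    "Tim Cook announced new Apple offices in Austin.",
    "Apple also expanded its research lab in Munich.",
]

# Group surface forms by entity type; repeated mentions collapse into a single candidate node.
nodes = defaultdict(set)
for doc in documents:
    for entity in ner(doc):
        nodes[entity["entity_group"]].add(entity["word"])

for entity_type, names in nodes.items():
    print(entity_type, sorted(names))
```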