DistilBERT Base Uncased Fine-tuned NER
A lightweight named entity recognition model based on DistilBERT, fine-tuned on the CoNLL-2003 dataset.
Downloads: 15
Release date: 3/2/2022
Model Overview
This model is a fine-tuned version of DistilBERT specifically designed for Named Entity Recognition (NER) tasks, capable of identifying entities such as person names, locations, and organization names in text.
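A minimal usage sketch with the Hugging Face transformers pipeline. The Hub id used below is an assumption based on the model's name; substitute the actual id of the uploaded model:

```python
from transformers import pipeline

def extract_entities(text, model_id="distilbert-base-uncased-finetuned-ner"):
    """Run token-classification NER and return grouped entity spans.

    NOTE: model_id is an assumption; replace it with the real Hub id.
    Downloads model weights on first call, so network access is required.
    """
    ner = pipeline(
        "token-classification",
        model=model_id,
        aggregation_strategy="simple",  # merge sub-tokens into whole entities
    )
    return ner(text)

# Example call (requires network access to fetch the model):
# extract_entities("Angela Merkel visited New York.")
```

The `aggregation_strategy="simple"` option makes the pipeline return whole entities (text, label, score) rather than raw per-token predictions.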
Model Features
Efficient and Lightweight
Based on the DistilBERT architecture, it is smaller and faster than standard BERT models while maintaining high accuracy.
High Accuracy
Achieves an F1 score of 0.9366 and accuracy of 0.9845 on the CoNLL-2003 evaluation set.
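For context, the entity-level F1 reported above is the harmonic mean of precision and recall. A minimal computation (the counts here are illustrative, not the actual CoNLL-2003 tallies):

```python
def f1_score(tp, fp, fn):
    """Entity-level F1 from true-positive, false-positive, false-negative counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Illustrative counts only
print(round(f1_score(tp=9, fp=1, fn=1), 2))  # 0.9
```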
Fast Inference
The distilled model design enables faster inference speeds compared to the full BERT model.
Model Capabilities
Named Entity Recognition
Text Token Classification
Entity Extraction
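Token classification on CoNLL-2003 uses BIO tags over PER, LOC, ORG, and MISC. Merging token-level tags into entity spans can be sketched as follows (the token/tag pairs below are illustrative, not actual model output):

```python
def merge_bio(tokens_tags):
    """Merge (token, BIO-tag) pairs into (entity_text, entity_type) spans."""
    entities, current = [], None
    for token, tag in tokens_tags:
        if tag.startswith("B-"):
            # Start of a new entity; flush any open one first
            if current:
                entities.append((" ".join(current[0]), current[1]))
            current = ([token], tag[2:])
        elif tag.startswith("I-") and current and tag[2:] == current[1]:
            # Continuation of the current entity
            current[0].append(token)
        else:
            # "O" tag or inconsistent continuation: close the open entity
            if current:
                entities.append((" ".join(current[0]), current[1]))
            current = None
    if current:
        entities.append((" ".join(current[0]), current[1]))
    return entities

# Illustrative tags in the CoNLL-2003 BIO scheme
tags = [("Angela", "B-PER"), ("Merkel", "I-PER"), ("visited", "O"),
        ("New", "B-LOC"), ("York", "I-LOC")]
print(merge_bio(tags))  # [('Angela Merkel', 'PER'), ('New York', 'LOC')]
```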
Use Cases
Information Extraction
News Entity Extraction
Extract key entities such as person names, locations, and organization names from news texts.
Accurately identifies various named entities in the text.
Document Analysis
Process legal documents or business reports to extract key entity information.
Helps quickly build document indexes and knowledge graphs.
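The indexing step above can be sketched as an inverted index from extracted entities to document ids, assuming NER output has already been collected per document (all document ids and entities below are hypothetical):

```python
from collections import defaultdict

def build_entity_index(docs):
    """Map each extracted entity to the ids of documents mentioning it.

    `docs` maps doc_id -> list of (entity_text, entity_type) pairs,
    e.g. as produced by running an NER model over each document.
    """
    index = defaultdict(set)
    for doc_id, entities in docs.items():
        for text, etype in entities:
            index[(text, etype)].add(doc_id)
    return dict(index)

# Hypothetical per-document NER output
docs = {
    "report-1": [("Acme Corp", "ORG"), ("Berlin", "LOC")],
    "report-2": [("Acme Corp", "ORG")],
}
print(build_entity_index(docs))
```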
Data Preprocessing
NLP Pipeline Preprocessing
Provide entity recognition preprocessing for question-answering systems or search engines.
Enhances the performance of downstream tasks.
© 2025 AIbase