DistilBERT Base Uncased Fine-tuned NER
A lightweight named entity recognition model based on DistilBERT, fine-tuned on the conll2003 dataset
Downloads: 16
Release Time: 6/10/2022
Model Overview
This model is a lightweight DistilBERT-based model designed specifically for named entity recognition. After fine-tuning on the conll2003 dataset, it efficiently identifies named entities in text.
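A minimal usage sketch with the Hugging Face transformers pipeline is shown below; the model ID is a placeholder for wherever this fine-tuned checkpoint is hosted.

```python
from transformers import pipeline

# Placeholder model ID: replace with the actual repository name of this checkpoint.
ner = pipeline("ner", model="your-namespace/distilbert-base-uncased-finetuned-ner")

# Each result is a dict with the predicted BIO tag, confidence score, word piece,
# and character offsets.
print(ner("Hugging Face is based in New York City."))
```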
Model Features
Efficient and Lightweight
Built on the DistilBERT architecture, it is lighter than standard BERT models while retaining strong performance.
High Accuracy
Achieves an F1 score of 93.2% on the conll2003 test set, demonstrating excellent performance.
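For context, the F1 score for NER is computed over entity spans rather than individual tokens. A small sketch using the seqeval library (commonly used for CoNLL-style evaluation; its use here is an assumption, not stated in the card) illustrates the computation on toy tag sequences.

```python
from seqeval.metrics import f1_score

# Toy gold and predicted tag sequences in BIO format (one inner list per sentence).
y_true = [["B-PER", "I-PER", "O", "O", "B-LOC"]]
y_pred = [["B-PER", "I-PER", "O", "O", "B-LOC"]]

# Span-level F1: a prediction counts only if the entity type and boundaries both match.
print(f1_score(y_true, y_pred))  # 1.0 for this toy example; ~0.932 is reported for the model
```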
Fast Training
Reaches good results after only 3 training epochs, making training efficient.
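A hedged sketch of what such a 3-epoch fine-tuning run could look like with the transformers Trainer is given below. The dataset loading and label alignment follow the standard conll2003 recipe; the batch size and learning rate are illustrative assumptions, not values reported for this model.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

raw = load_dataset("conll2003")
label_names = raw["train"].features["ner_tags"].feature.names  # 9 BIO tags

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize_and_align(batch):
    # Tokenize pre-split words and copy each word's tag to its first sub-token;
    # remaining sub-tokens and special tokens get -100 so the loss ignores them.
    enc = tokenizer(batch["tokens"], truncation=True, is_split_into_words=True)
    all_labels = []
    for i, tags in enumerate(batch["ner_tags"]):
        word_ids = enc.word_ids(batch_index=i)
        labels, previous = [], None
        for word_id in word_ids:
            if word_id is None or word_id == previous:
                labels.append(-100)
            else:
                labels.append(tags[word_id])
            previous = word_id
        all_labels.append(labels)
    enc["labels"] = all_labels
    return enc

tokenized = raw.map(tokenize_and_align, batched=True)

model = AutoModelForTokenClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=len(label_names)
)

args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-ner",
    num_train_epochs=3,              # the 3-epoch schedule mentioned above
    per_device_train_batch_size=16,  # assumed value, not stated in the card
    learning_rate=2e-5,              # assumed value, not stated in the card
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    data_collator=DataCollatorForTokenClassification(tokenizer),
).train()
```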
Model Capabilities
Named Entity Recognition
Text Token Classification
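The token classification capability can also be exercised directly, without the pipeline, by reading per-token predictions from the raw model. The model ID below is again a placeholder.

```python
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

model_id = "your-namespace/distilbert-base-uncased-finetuned-ner"  # placeholder ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

inputs = tokenizer("Angela Merkel visited Paris.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Print one BIO label per sub-token (special tokens included).
predicted_ids = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, label_id in zip(tokens, predicted_ids):
    print(token, model.config.id2label[label_id.item()])
```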
Use Cases
Information Extraction
News Entity Extraction
Extract entities such as person names, locations, and organization names from news text.
Identifies entities with an F1 score of approximately 93.2%.
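An illustrative sketch of this use case: pull person, location, and organization mentions out of a news sentence, using the grouped-entity output of the pipeline (the model ID and sentence are placeholders).

```python
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="your-namespace/distilbert-base-uncased-finetuned-ner",  # placeholder ID
    aggregation_strategy="simple",  # merge word pieces into whole entities
)

headline = "Apple CEO Tim Cook met EU officials in Brussels on Tuesday."
for entity in ner(headline):
    if entity["entity_group"] in {"PER", "LOC", "ORG"}:
        print(f'{entity["entity_group"]}: {entity["word"]} ({entity["score"]:.2f})')
# Expected output along the lines of: ORG: Apple, PER: Tim Cook, LOC: Brussels
```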
Document Analysis
Process legal or medical documents to identify key entity information.
Data Preprocessing
NLP Preprocessing
Provide entity recognition preprocessing for downstream NLP tasks.
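One way to use the model as a preprocessing step is sketched below: detected entities are wrapped in inline type tags so a downstream component can consume them. The model ID and the tag format are illustrative assumptions.

```python
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="your-namespace/distilbert-base-uncased-finetuned-ner",  # placeholder ID
    aggregation_strategy="simple",
)

def tag_entities(text: str) -> str:
    """Return the text with each detected entity wrapped as [TYPE entity]."""
    pieces, cursor = [], 0
    for entity in ner(text):
        pieces.append(text[cursor:entity["start"]])
        pieces.append(f'[{entity["entity_group"]} {text[entity["start"]:entity["end"]]}]')
        cursor = entity["end"]
    pieces.append(text[cursor:])
    return "".join(pieces)

# A downstream task (search indexing, relation extraction, etc.) can then consume the tags.
print(tag_entities("Barack Obama spoke in Chicago."))
# e.g. "[PER Barack Obama] spoke in [LOC Chicago]."
```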