
DistilBERT NER

Developed by dslim
A lightweight named entity recognition model fine-tuned from DistilBERT, balancing accuracy and efficiency
Downloads 48.95k
Release Date: 1/25/2024

Model Overview

A distilled version of BERT fine-tuned for named entity recognition, identifying four entity types: LOC, ORG, PER, and MISC

Model Features

Lightweight and Efficient
With only 66M parameters, about 40% smaller than BERT-base, it delivers noticeably faster inference
Four-Class Entity Recognition
Identifies locations (LOC), organizations (ORG), persons (PER), and miscellaneous entities (MISC)
CoNLL-2003 Benchmark
Fine-tuned on the standard NER dataset, achieving an F1 score of 0.9217

Model Capabilities

Text Entity Recognition
Boundary Detection for Contiguous (Multi-Word) Entities
News Domain Entity Extraction
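The four labels above follow the IOB2 tagging scheme used by CoNLL-2003 (B-PER, I-PER, O, …), where multi-word entities are recovered by merging a B- token with the I- tokens that follow it. A minimal sketch of that merging step; the tokens and tags here are hand-written for illustration, not actual model output (in practice they would come from the model, e.g. via the `transformers` token-classification pipeline with `model="dslim/distilbert-NER"`):

```python
def group_entities(tokens, tags):
    """Merge IOB2-tagged tokens (B-/I-/O) into (text, label) entity spans."""
    spans, current, label = [], [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A B- tag opens a new entity; flush any span in progress.
            if current:
                spans.append((" ".join(current), label))
            current, label = [tok], tag[2:]
        elif tag.startswith("I-") and current and tag[2:] == label:
            # An I- tag with a matching label extends the current span.
            current.append(tok)
        else:
            # An O tag (or a mismatched I-) closes the current span.
            if current:
                spans.append((" ".join(current), label))
            current, label = [], None
    if current:
        spans.append((" ".join(current), label))
    return spans

# Illustrative tags only, not real model predictions:
tokens = ["Angela", "Merkel", "visited", "Microsoft", "in", "Seattle"]
tags = ["B-PER", "I-PER", "O", "B-ORG", "O", "B-LOC"]
print(group_entities(tokens, tags))
# [('Angela Merkel', 'PER'), ('Microsoft', 'ORG'), ('Seattle', 'LOC')]
```

The `transformers` pipeline performs this aggregation automatically when `aggregation_strategy="simple"` is passed, so the helper above is only needed when working with raw per-token outputs.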

Use Cases

Information Extraction
News Text Analysis
Extract organizations, person names, and geographical locations from news reports
Example input and output can be found in the model card
Knowledge Graph Construction
Entity Relation Extraction
Serves as a pre-processing entity recognition module for knowledge graph construction
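As a pre-processing module, the NER step emits (text, label) spans; one simple way to seed a knowledge graph from them is to pair entities that co-occur in the same sentence as candidate edges, leaving the actual relation label to a downstream extraction model. A sketch with a hypothetical `candidate_edges` helper (not part of the model or any library):

```python
from itertools import combinations

def candidate_edges(entities):
    """Pair co-occurring (text, label) entity spans as candidate
    knowledge-graph edges; the relation itself is left unlabeled
    for a downstream relation-extraction step."""
    return [(a, b) for a, b in combinations(entities, 2)]

# Spans as an NER step might produce them for one sentence:
ents = [("Angela Merkel", "PER"), ("Microsoft", "ORG"), ("Seattle", "LOC")]
for (s, sl), (o, ol) in candidate_edges(ents):
    print(f"{s} ({sl}) -- ? --> {o} ({ol})")
# Angela Merkel (PER) -- ? --> Microsoft (ORG)
# Angela Merkel (PER) -- ? --> Seattle (LOC)
# Microsoft (ORG) -- ? --> Seattle (LOC)
```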