
DistilBERT Base Uncased Fine-tuned NER

Developed by thomaszz
A lightweight named entity recognition model based on DistilBERT, fine-tuned on the CoNLL-2003 dataset.
Downloads 15
Release Time: 3/2/2022

Model Overview

This model is a fine-tuned version of DistilBERT, specifically designed for named entity recognition tasks. It achieves strong results on the CoNLL-2003 dataset while offering fast inference and a compact model size.
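A typical way to use a fine-tuned token-classification model like this one is through the transformers pipeline API. The sketch below is a hypothetical usage example: the repository id is inferred from the model card title and author name and may differ on the Hub, and the example sentence is illustrative.

```python
# Hypothetical usage sketch; the repo id below is assumed from the model
# card title ("thomaszz" + "distilbert-base-uncased-finetuned-ner") and
# may not match the actual Hub repository.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="thomaszz/distilbert-base-uncased-finetuned-ner",
    aggregation_strategy="simple",  # merge word pieces into whole entities
)

for entity in ner("Hugging Face is based in New York City."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

The aggregation_strategy argument collapses subword predictions (e.g. "New", "York", "City") into a single entity span, which is usually what downstream consumers want.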

Model Features

Efficient and Lightweight
Based on the DistilBERT architecture, it is 40% smaller and 60% faster than standard BERT while retaining about 97% of its language-understanding performance.
High-precision NER
Achieves an F1 score of approximately 93% on the CoNLL-2003 evaluation set.
Fast Inference
The distilled architecture is optimized for inference speed, making it suitable for deployment in production environments.

Model Capabilities

Named Entity Recognition
Text Token Classification
Entity Extraction
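The capabilities above are related: the model classifies each token with a CoNLL-style BIO tag, and entity extraction then groups those tags into spans. A minimal sketch of that decoding step, with illustrative tokens and tags (not actual model output):

```python
# Minimal sketch of decoding BIO tags (as produced by a CoNLL-style NER
# model) into entity spans; the tokens and tags below are illustrative.
def bio_to_entities(tokens, tags):
    """Group B-X / I-X / O tags into (entity_type, text) spans."""
    entities, current_type, current_tokens = [], None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A B- tag starts a new entity, closing any open one.
            if current_type:
                entities.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_type == tag[2:]:
            # An I- tag of the same type continues the open entity.
            current_tokens.append(token)
        else:
            # O, or a mismatched I- tag: close any open entity.
            if current_type:
                entities.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = None, []
    if current_type:
        entities.append((current_type, " ".join(current_tokens)))
    return entities

tokens = ["John", "Smith", "works", "at", "Acme", "Corp", "in", "Paris"]
tags = ["B-PER", "I-PER", "O", "O", "B-ORG", "I-ORG", "O", "B-LOC"]
print(bio_to_entities(tokens, tags))
# → [('PER', 'John Smith'), ('ORG', 'Acme Corp'), ('LOC', 'Paris')]
```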

Use Cases

Information Extraction
News Entity Extraction
Extract entities such as person names, locations, and organization names from news text, identifying them with roughly 93% F1.
Document Analysis
Recognize domain-specific terms in legal or medical documents.
Data Preprocessing
Knowledge Graph Construction
Automatically extract entities for knowledge graph systems.