
DistilBERT Base Uncased Fine-tuned NER

Developed by roschmid
A lightweight named entity recognition model based on DistilBERT, fine-tuned on the CoNLL-2003 dataset, offering efficient inference and high accuracy.
Downloads: 15
Released: 5/23/2022

Model Overview

This model is a lightweight version based on DistilBERT, specifically fine-tuned for Named Entity Recognition (NER) tasks. It can identify entities such as person names, locations, and organization names in text, making it suitable for information extraction and text analysis tasks.
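The model can be loaded for inference with the Hugging Face transformers library. A minimal sketch is below; the hub id `roschmid/distilbert-base-uncased-finetuned-ner` is assumed from the developer and model names on this card, so verify it (and consider pinning a revision) before relying on it.

```python
# Sketch: NER inference via the transformers token-classification pipeline.
# The model id is assumed from this card's metadata, not verified.
from transformers import pipeline

ner = pipeline(
    "ner",
    model="roschmid/distilbert-base-uncased-finetuned-ner",
    aggregation_strategy="simple",  # merge word pieces into whole entities
)

for entity in ner("Angela Merkel visited the Google office in Paris."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

With `aggregation_strategy="simple"`, sub-word tokens are merged so each result row is a whole entity span rather than an individual word piece.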

Model Features

Efficient and Lightweight
Based on the DistilBERT architecture, it is 40% smaller than the standard BERT model while retaining 95% of its performance.
High Accuracy
Achieves an F1 score of 92.7% and an accuracy of 98.3% on the CoNLL-2003 test set.
Fast Inference
The distilled model design enables 60% faster inference speed compared to the full BERT model.

Model Capabilities

Named Entity Recognition
Text token classification
Information extraction
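Token classification output from CoNLL-style NER models follows the BIO tagging scheme (B-PER, I-PER, O, …). As a sketch, a small decoder that groups per-token tags back into entity spans might look like this (the helper name and sample data are illustrative):

```python
def decode_bio(tokens, tags):
    """Group BIO-tagged tokens (e.g. B-PER, I-PER, O) into (type, text) spans."""
    entities, current = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                entities.append(current)
            current = (tag[2:], [token])  # start a new entity of this type
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            current[1].append(token)      # continue the current entity
        else:
            if current:
                entities.append(current)  # flush on O or a type mismatch
                current = None
    if current:
        entities.append(current)
    return [(etype, " ".join(words)) for etype, words in entities]

tokens = ["John", "Smith", "works", "at", "Acme", "Corp", "in", "Berlin"]
tags   = ["B-PER", "I-PER", "O", "O", "B-ORG", "I-ORG", "O", "B-LOC"]
print(decode_bio(tokens, tags))
# → [('PER', 'John Smith'), ('ORG', 'Acme Corp'), ('LOC', 'Berlin')]
```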

Use Cases

Information Extraction
News Entity Extraction: extract person names, locations, and organization names from news articles, accurately identifying the key entities in the text.
Document Analysis: process legal or medical documents to identify key entities, helping to quickly locate important information.
Data Preprocessing
Knowledge Graph Construction: provide entity recognition for building knowledge graphs, serving as the first step in the knowledge-extraction pipeline.
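For the knowledge-graph use case, the recognized entity spans can feed a downstream graph builder. A toy sketch is below; the node/edge representation (sentence-level co-occurrence) is an illustrative assumption, not part of the model itself:

```python
# Sketch: turn per-sentence NER output into a toy co-occurrence graph.
# Each (type, text) entity becomes a node; entities found in the same
# sentence are linked by an edge. Representation is illustrative only.
from itertools import combinations

def entities_to_graph(sentences_entities):
    nodes, edges = set(), set()
    for entities in sentences_entities:
        nodes.update(entities)
        for a, b in combinations(sorted(set(entities)), 2):
            edges.add((a, b))
    return nodes, edges

# Hypothetical NER output for two sentences:
ner_output = [
    [("PER", "John Smith"), ("ORG", "Acme Corp")],
    [("ORG", "Acme Corp"), ("LOC", "Berlin")],
]
nodes, edges = entities_to_graph(ner_output)
print(len(nodes), len(edges))  # 3 nodes, 2 edges
```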