
distilbert-base-uncased-finetuned-ner

Developed by indridinn
A lightweight named entity recognition model based on DistilBERT, fine-tuned on the conll2003 dataset, offering efficient inference and high accuracy.
Downloads 15
Release Time: 3/2/2022

Model Overview

This model is a lightweight DistilBERT-based model fine-tuned specifically for Named Entity Recognition (NER). It maintains high performance while significantly reducing model size and computational resource requirements.
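For reference, a minimal inference sketch with the Hugging Face transformers pipeline, assuming the checkpoint is published on the Hub as "indridinn/distilbert-base-uncased-finetuned-ner" (an ID inferred from the developer and model name above, not stated explicitly in this card):

```python
# Minimal NER inference sketch using the transformers pipeline API.
# The model ID below is an assumption inferred from this card; adjust it
# if the checkpoint is hosted under a different name.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="indridinn/distilbert-base-uncased-finetuned-ner",
    aggregation_strategy="simple",  # merge sub-word tokens into whole entities
)

text = "Angela Merkel visited the Microsoft office in Seattle."
for entity in ner(text):
    # Each result carries the entity group (PER/LOC/ORG/MISC), the matched
    # text span, and a confidence score.
    print(entity["entity_group"], entity["word"], f"{entity['score']:.3f}")
```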

Model Features

Efficient and Lightweight
Built on the DistilBERT architecture, it is 40% smaller than standard BERT, roughly 60% faster at inference, and retains over 95% of BERT's performance.
High Accuracy
Achieves an F1 score of 93.22% and an accuracy of 98.36% on the conll2003 test set.
Quick Deployment
Its small size makes it suitable for deployment in resource-constrained environments, and it supports batch inference (see the sketch below).
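To illustrate the batch-inference point, here is a hedged sketch that runs several sentences through the tokenizer and model in a single forward pass (same assumed checkpoint ID as above):

```python
# Batched NER inference sketch using the tokenizer and model directly.
# The checkpoint ID is an assumption based on this card's developer/model name.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

model_id = "indridinn/distilbert-base-uncased-finetuned-ner"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)
model.eval()

texts = [
    "EU rejects German call to boycott British lamb.",
    "Peter Blackburn reports from Brussels.",
]

# Tokenize the whole batch with padding so all sequences share one tensor shape.
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch, seq_len, num_labels)

predictions = logits.argmax(dim=-1)
for i, text in enumerate(texts):
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][i].tolist())
    labels = [model.config.id2label[p.item()] for p in predictions[i]]
    # Skip padding and other special tokens when reading out the tags.
    tagged = [
        (tok, lab)
        for tok, lab in zip(tokens, labels)
        if tok not in tokenizer.all_special_tokens
    ]
    print(text, tagged)
```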

Model Capabilities

Named Entity Recognition
Text Token Classification
Entity Boundary Detection
Entity Type Classification (boundary detection and type decoding are illustrated in the sketch after this list)
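Under the hood, these capabilities come down to assigning a BIO tag (B-PER, I-PER, O, and so on) to each token and merging tagged runs into entity spans. The helper below is a hypothetical illustration of that decoding step for the conll2003 tag scheme; it is not part of the model's own code:

```python
# Hypothetical helper: turn per-token BIO tags into (entity_type, text) spans.
# Tags follow the conll2003 scheme: O, B-PER, I-PER, B-ORG, I-ORG,
# B-LOC, I-LOC, B-MISC, I-MISC.
from typing import List, Tuple


def bio_to_spans(tokens: List[str], tags: List[str]) -> List[Tuple[str, str]]:
    spans = []
    current_type, current_tokens = None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-") or (tag.startswith("I-") and current_type != tag[2:]):
            # A new entity starts: flush the previous one, if any.
            if current_tokens:
                spans.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = tag[2:], [token]
        elif tag.startswith("I-"):
            current_tokens.append(token)  # continue the current entity
        else:  # an "O" tag closes any open entity
            if current_tokens:
                spans.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = None, []
    if current_tokens:
        spans.append((current_type, " ".join(current_tokens)))
    return spans


tokens = ["EU", "rejects", "German", "call", "to", "boycott", "British", "lamb", "."]
tags = ["B-ORG", "O", "B-MISC", "O", "O", "O", "B-MISC", "O", "O"]
print(bio_to_spans(tokens, tags))
# [('ORG', 'EU'), ('MISC', 'German'), ('MISC', 'British')]
```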

Use Cases

Information Extraction
News Entity Extraction
Extract key entities such as person names, locations, and organization names from news texts.
Accurately identifies four types of entities (PER, LOC, ORG, MISC) in texts.
Document Processing
Contract Key Information Extraction
Automatically identify key entities in contracts such as the signing parties (person and organization names); dates and amounts fall outside the four conll2003 entity types and would require additional fine-tuning.