
DistilBERT Base Uncased Fine-tuned NER

Developed by Hank
A lightweight named entity recognition model based on DistilBERT, fine-tuned on the CoNLL-2003 dataset.
Released: 3/2/2022

Model Overview

This model is a fine-tuned version of DistilBERT for Named Entity Recognition (NER). It maintains high accuracy while being smaller and faster than the original BERT model.

Model Features

Lightweight and Efficient
Based on the DistilBERT architecture, it is 40% smaller and 60% faster than the standard BERT model while retaining 97% of its performance.
High-precision NER
Achieves an F1 score of 93.14% and a token-level accuracy of 98.39% on the CoNLL-2003 evaluation set.
Rapid Fine-tuning
Requires only 3 training epochs to achieve high performance, with training loss decreasing from 0.243 to 0.0612.
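The three-epoch fine-tuning run described above can be sketched with the Hugging Face Trainer API. The epoch count matches the figure cited here; the learning rate, batch size, and output directory are illustrative assumptions, not the exact values used for this model.

```python
# CoNLL-2003 uses 9 BIO tags covering persons, organizations,
# locations, and miscellaneous entities.
CONLL_LABELS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG",
                "B-LOC", "I-LOC", "B-MISC", "I-MISC"]
id2label = dict(enumerate(CONLL_LABELS))
label2id = {label: i for i, label in enumerate(CONLL_LABELS)}


def build_trainer(train_dataset, eval_dataset):
    """Assemble a token-classification Trainer. Hyperparameters other
    than the 3 epochs cited above are illustrative assumptions."""
    # Deferred import so the label mappings above remain usable
    # without transformers installed.
    from transformers import (AutoModelForTokenClassification, AutoTokenizer,
                              DataCollatorForTokenClassification, Trainer,
                              TrainingArguments)

    tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
    model = AutoModelForTokenClassification.from_pretrained(
        "distilbert-base-uncased",
        num_labels=len(CONLL_LABELS),
        id2label=id2label,
        label2id=label2id,
    )
    args = TrainingArguments(
        output_dir="distilbert-finetuned-ner",  # assumed output path
        num_train_epochs=3,               # matches the 3 epochs cited above
        learning_rate=2e-5,               # assumed; common for BERT-family models
        per_device_train_batch_size=16,   # assumed
    )
    return Trainer(
        model=model,
        args=args,
        train_dataset=train_dataset,
        eval_dataset=eval_dataset,
        data_collator=DataCollatorForTokenClassification(tokenizer),
    )
```

Calling `build_trainer(...).train()` on tokenized CoNLL-2003 splits reproduces the general recipe; the loss trajectory will depend on the exact hyperparameters chosen.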

Model Capabilities

Named Entity Recognition
Text Token Classification
Entity Boundary Detection
Entity Type Classification
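These capabilities can be exercised through the Hugging Face pipeline API. The sketch below is a minimal example: the hub id is a placeholder (the exact repository name for this model is not given here), and `format_entities` is a hypothetical helper for turning pipeline output into (text, label) pairs.

```python
def format_entities(raw):
    """Convert token-classification pipeline output (with
    aggregation_strategy='simple') into (text, label) pairs.
    Pure helper, usable without loading the model."""
    return [(e["word"], e["entity_group"]) for e in raw]


def run_ner(text, model_id="distilbert-base-uncased-finetuned-ner"):
    """Run NER over a string. model_id is a placeholder; replace it
    with the actual hub id of this model."""
    from transformers import pipeline
    nlp = pipeline("token-classification", model=model_id,
                   aggregation_strategy="simple")
    return format_entities(nlp(text))


if __name__ == "__main__":
    print(run_ner("Angela Merkel visited Microsoft in Seattle."))
```

With `aggregation_strategy="simple"`, the pipeline merges word pieces and adjacent same-type tokens, so each returned item is an entity span rather than a raw subword token.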

Use Cases

Information Extraction
Entity Extraction from News Articles
Identify entities such as person names, locations, and organization names from news texts
Recognizes entities with over 93% F1 on CoNLL-2003-style text
Document Automation Processing
Automatically extract key entity information from contracts or legal documents
Knowledge Graph Construction
Entity Linking for Knowledge Graphs
Provide foundational entity recognition for knowledge graph construction
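For knowledge-graph construction, the model's per-token BIO tags usually need to be grouped into entity spans first. A minimal decoding sketch (a hypothetical helper, not part of the model itself):

```python
def bio_to_spans(tokens, tags):
    """Group per-token BIO tags into (entity_text, entity_type) spans.
    A 'B-' tag opens a span; subsequent matching 'I-' tags extend it;
    'O' or an inconsistent 'I-' tag closes any open span."""
    spans, current_tokens, current_type = [], [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current_tokens:
                spans.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current_type == tag[2:]:
            current_tokens.append(token)
        else:
            if current_tokens:
                spans.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [], None
    if current_tokens:
        spans.append((" ".join(current_tokens), current_type))
    return spans
```

For example, `bio_to_spans(["Barack", "Obama", "visited", "Paris"], ["B-PER", "I-PER", "O", "B-LOC"])` yields `[("Barack Obama", "PER"), ("Paris", "LOC")]`, which can then feed an entity-linking step.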