distilbert-base-uncased-finetuned-ner

Developed by: ACSHCSE
A lightweight named entity recognition model based on DistilBERT, fine-tuned on the conll2003 dataset.
Downloads: 15
Release date: 4/12/2022

Model Overview

This is a lightweight DistilBERT-based model fine-tuned specifically for Named Entity Recognition (NER). It achieves strong results on the standard conll2003 benchmark and is suitable for general-purpose entity extraction from English text.
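A minimal usage sketch with the Hugging Face `transformers` library. The repository id `ACSHCSE/distilbert-base-uncased-finetuned-ner` is an assumption inferred from the developer and model names above, and the example sentence is illustrative.

```python
# Sketch: load the fine-tuned NER model via the transformers pipeline.
# The repo id below is assumed from the card's metadata, not confirmed.
from transformers import pipeline

ner = pipeline(
    "ner",
    model="ACSHCSE/distilbert-base-uncased-finetuned-ner",  # assumed repo id
    aggregation_strategy="simple",  # merge word pieces into whole entities
)

for entity in ner("Angela Merkel visited Paris last week."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

With `aggregation_strategy="simple"`, sub-word pieces are merged back into whole words, so each printed row is one complete entity with its type and confidence.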

Model Features

Efficient and Lightweight
Based on the DistilBERT architecture, it is 40% smaller than the standard BERT model while retaining 95% of its performance.
High-precision NER
Achieves an F1 score of 92.9% on the conll2003 evaluation set.
Fast Inference
The distilled architecture provides faster inference, making the model suitable for deployment in production environments.

Model Capabilities

Named Entity Recognition
Text Token Classification
Entity Boundary Detection
Entity Type Classification
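The last two capabilities above (boundary detection and type classification) come from decoding the model's token-level BIO tags into typed spans. A minimal sketch of that decoding step, using the conll2003 tag scheme (B-PER, I-PER, B-LOC, ...); the tokens and tags below are illustrative, not model output.

```python
# Sketch: group token-level BIO tags into (entity_text, entity_type) spans.
def decode_bio(tokens, tags):
    """Decode parallel lists of tokens and BIO tags into entity spans."""
    entities = []
    current_tokens, current_type = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):  # a new entity begins
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current_type == tag[2:]:
            current_tokens.append(token)  # entity continues
        else:  # "O" tag, or an I- tag that does not continue the open span
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [], None
    if current_tokens:  # flush a span that runs to the end of the sentence
        entities.append((" ".join(current_tokens), current_type))
    return entities

tokens = ["Angela", "Merkel", "visited", "Paris", "."]
tags = ["B-PER", "I-PER", "O", "B-LOC", "O"]
print(decode_bio(tokens, tags))  # [('Angela Merkel', 'PER'), ('Paris', 'LOC')]
```

The B-/I- distinction is what lets the decoder separate two adjacent entities of the same type, which plain per-token type labels could not do.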

Use Cases

Information Extraction
News Article Entity Extraction
Identify entities such as person names, locations, and organization names in news text, with an entity-level F1 of 92.9% on conll2003.
Automated Document Processing
Automatically extract key entity information from contracts and legal documents.
Knowledge Graph Construction
Knowledge Graph Entity Extraction
Extract entities from unstructured text for knowledge graph construction.
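The 92.9% figure quoted above is an entity-level F1 score: the harmonic mean of precision and recall over exact span-and-type matches between predicted and gold entities. A small self-contained sketch of that metric; the example spans are made up for illustration.

```python
# Sketch: entity-level precision, recall and F1 over (span, type) pairs.
def entity_f1(predicted, gold):
    """Score predicted entities against gold entities by exact match."""
    pred_set, gold_set = set(predicted), set(gold)
    tp = len(pred_set & gold_set)  # exact span + type matches
    precision = tp / len(pred_set) if pred_set else 0.0
    recall = tp / len(gold_set) if gold_set else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

gold = [("Angela Merkel", "PER"), ("Paris", "LOC"), ("EU", "ORG")]
pred = [("Angela Merkel", "PER"), ("Paris", "LOC")]
p, r, f = entity_f1(pred, gold)
print(round(p, 3), round(r, 3), round(f, 3))  # 1.0 0.667 0.8
```

Note that a span with the right boundaries but the wrong type counts as both a false positive and a false negative, which is why entity-level F1 is stricter than per-token accuracy.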