
DistilBERT Base Uncased Fine-tuned NER

Developed by tiennvcs
A lightweight named entity recognition model based on DistilBERT, fine-tuned on the CoNLL-2003 dataset and suited to entity tagging tasks in English text.
Downloads: 15
Released: 3/24/2022

Model Overview

This model is a lightweight version based on DistilBERT, specifically fine-tuned for Named Entity Recognition (NER) tasks. It can identify entities such as person names, locations, and organization names in text, making it suitable for information extraction and text analysis scenarios.
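As a sketch, a model like this can be loaded through the Hugging Face `pipeline` API. The model id `tiennvcs/distilbert-base-uncased-finetuned-ner` below is an assumption inferred from the author and model name, and the example sentence is illustrative.

```python
def build_ner_pipeline(model_id="tiennvcs/distilbert-base-uncased-finetuned-ner"):
    """Load the fine-tuned model as a token-classification pipeline.

    The model id is an assumption based on the author and model name.
    """
    from transformers import pipeline  # imported lazily; requires `transformers`
    # aggregation_strategy="simple" merges word-piece tokens back into
    # whole-word entities with a single label and score each.
    return pipeline("ner", model=model_id, aggregation_strategy="simple")

if __name__ == "__main__":
    ner = build_ner_pipeline()
    for ent in ner("Angela Merkel visited the Siemens headquarters in Munich."):
        print(ent["entity_group"], ent["word"], round(float(ent["score"]), 3))
```

Each result is a dict with the entity type, the matched text span, and a confidence score.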

Model Features

Efficient and Lightweight
Based on the DistilBERT architecture, it significantly reduces model size and computational requirements while maintaining high performance.
High-precision NER
Achieves an F1 score of 93.1% on the standard CoNLL-2003 test set, demonstrating excellent performance.
Fast Inference
The distilled architecture yields faster inference than the original BERT.
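For context on how a figure like the 93.1% F1 is computed, here is a minimal, illustrative sketch of entity-level F1 over BIO tag sequences, in the spirit of the CoNLL-2003 evaluation (not the official conlleval script):

```python
def bio_spans(tags):
    """Extract (type, start, end) entity spans from a BIO tag sequence."""
    spans, etype, start = [], None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            if etype is not None:
                spans.append((etype, start, i))
            etype, start = tag[2:], i
        elif tag.startswith("I-") and tag[2:] == etype:
            continue  # the current entity continues
        else:
            if etype is not None:
                spans.append((etype, start, i))
            if tag.startswith("I-"):  # stray I- tag: treat as a new entity
                etype, start = tag[2:], i
            else:
                etype, start = None, None
    if etype is not None:
        spans.append((etype, start, len(tags)))
    return spans

def entity_f1(gold, pred):
    """Entity-level F1: a span counts only if type and boundaries both match."""
    g, p = set(bio_spans(gold)), set(bio_spans(pred))
    tp = len(g & p)
    precision = tp / len(p) if p else 0.0
    recall = tp / len(g) if g else 0.0
    return 2 * precision * recall / (precision + recall) if tp else 0.0
```

Scoring whole entity spans rather than individual tokens is what makes the metric strict: a correct entity type with a wrong boundary earns no credit.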

Model Capabilities

Named Entity Recognition
Text Token Classification
Information Extraction
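Token classification reduces to picking the highest-scoring label for each token. A hedged sketch, assuming the standard CoNLL-2003 BIO label set (the actual label order lives in the model's `id2label` config and may differ):

```python
# Assumed label order; the real mapping comes from the model config.
CONLL_LABELS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG",
                "B-LOC", "I-LOC", "B-MISC", "I-MISC"]

def decode(logits):
    """Map per-token logit vectors to label strings via argmax."""
    return [CONLL_LABELS[max(range(len(v)), key=v.__getitem__)] for v in logits]
```

In practice the logits come from the model's token-classification head, one vector per word-piece token.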

Use Cases

Information Extraction
News Entity Extraction
Automatically identifies people, places, and organizations in news text.
Accurately tags each entity's type.
Document Analysis
Recognizes domain-specific terminology in legal or medical documents.
Helps quickly locate key information.
Data Preprocessing
Knowledge Graph Construction
Provides entity recognition preprocessing for knowledge graph generation.
Improves the efficiency of knowledge graph construction.
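As an illustration of the knowledge-graph preprocessing step, the sketch below collects unique (entity text, type) pairs from NER output to serve as candidate graph nodes. The input records mirror the transformers pipeline output format; the values are made up.

```python
def entity_nodes(ner_results):
    """Deduplicate NER output into (text, type) pairs for graph nodes."""
    seen, nodes = set(), []
    for ent in ner_results:
        key = (ent["word"], ent["entity_group"])
        if key not in seen:
            seen.add(key)
            nodes.append(key)
    return nodes

# Illustrative records in the shape produced by the NER pipeline.
example = [
    {"word": "Siemens", "entity_group": "ORG", "score": 0.99},
    {"word": "Munich", "entity_group": "LOC", "score": 0.98},
    {"word": "Siemens", "entity_group": "ORG", "score": 0.97},
]
```

A downstream step would then link these nodes with relations extracted by other means.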