
DistilBERT Base Uncased Fine-tuned NER

Developed by lucasmtz
A lightweight named entity recognition (NER) model based on DistilBERT, fine-tuned on the CoNLL-2003 dataset.
Downloads: 14
Released: 3/2/2022

Model Overview

This model is a lightweight DistilBERT variant, specifically fine-tuned for named entity recognition. It performs strongly on the CoNLL-2003 dataset and is suited to entity-labeling tasks in English text.
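A token-classification model like this one emits a CoNLL-2003 BIO tag (e.g. B-PER, I-PER, B-LOC, B-ORG, B-MISC, O) for each token; downstream code then merges those tags into entity spans. A minimal sketch of that merging step, using hypothetical tagged tokens rather than actual model output:

```python
# Minimal sketch: group BIO tags (the CoNLL-2003 scheme this model uses)
# into entity spans. The tagged tokens below are hypothetical examples,
# not actual output from this model.

def group_entities(tokens, tags):
    """Merge B-/I- tagged tokens into (entity_text, entity_type) spans."""
    entities = []
    current_tokens, current_type = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):          # a new entity begins
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current_type == tag[2:]:
            current_tokens.append(token)  # continue the current entity
        else:                             # "O" or an inconsistent I- tag
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [], None
    if current_tokens:                    # flush a trailing entity
        entities.append((" ".join(current_tokens), current_type))
    return entities

tokens = ["Angela", "Merkel", "visited", "Paris", "."]
tags = ["B-PER", "I-PER", "O", "B-LOC", "O"]
print(group_entities(tokens, tags))
# → [('Angela Merkel', 'PER'), ('Paris', 'LOC')]
```

Libraries such as Hugging Face Transformers perform an equivalent aggregation internally when a token-classification pipeline is asked to return grouped entities.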

Model Features

Efficient and Lightweight
Based on the DistilBERT architecture, it is smaller and faster than the standard BERT model while maintaining good performance.
High-precision NER
Achieves an F1 score of 93.1% on the CoNLL-2003 test set, demonstrating excellent performance.
Fast Inference
The distilled architecture enables faster inference, making the model suitable for deployment in production environments.

Model Capabilities

Named Entity Recognition
Text Token Classification
Entity Labeling

Use Cases

Information Extraction
News Entity Extraction
Identify entities such as person names, locations, and organization names from news text.
Reported performance: accuracy 98.3%, F1 score 93.1%.
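The reported F1 score is the harmonic mean of precision and recall computed over predicted entity spans. A small sketch of the formula, using hypothetical span counts chosen purely to illustrate how a 93.1% figure arises:

```python
# Entity-level F1 is the harmonic mean of precision and recall over
# predicted entity spans. The counts below are hypothetical, chosen only
# to illustrate the formula behind a 93.1% score.

def f1_score(true_positives, false_positives, false_negatives):
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return 2 * precision * recall / (precision + recall)

# e.g. 931 correct spans, 69 spurious, 69 missed:
print(round(f1_score(931, 69, 69), 3))
# → 0.931
```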
Biomedical Text Processing
Identify professional terms and entities in medical literature.
Data Preprocessing
Knowledge Graph Construction
Provide entity recognition preprocessing for knowledge graph construction.