
DistilBERT Base Uncased Fine-tuned NER

Developed by Udi-Aharon
This model is a lightweight DistilBERT variant fine-tuned on the CoNLL-2003 dataset for named entity recognition.
Downloads: 15
Released: 5/10/2022

Model Overview

This is a DistilBERT model fine-tuned for Named Entity Recognition (NER), achieving strong results on CoNLL-2003: 92.4% precision, 93.5% recall, and an F1 score of 0.9295.

Model Features

Efficient and Lightweight
Built on the DistilBERT architecture, it is smaller and faster than standard BERT models while retaining most of their accuracy.
High-precision NER
Achieves 92.4% precision and 93.5% recall on the CoNLL-2003 dataset.
Fast Inference
The distilled architecture makes inference faster than with the full BERT model.
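
The 92.4% precision and 93.5% recall above imply the F1 score reported later in this card, since F1 is their harmonic mean. A quick check in plain Python:

```python
# F1 is the harmonic mean of precision and recall.
precision = 0.924  # reported precision on CoNLL-2003
recall = 0.935     # reported recall on CoNLL-2003

f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # → 0.9295, matching the reported F1 score
```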

Model Capabilities

Named Entity Recognition
Text Token Classification
Entity Extraction
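
Token classification output for CoNLL-2003 uses BIO tags over four entity types (PER, LOC, ORG, MISC). The sketch below shows how per-token tags can be merged into entity spans; it is an illustrative helper for interpreting the model's output, not the model's own post-processing (the transformers library offers built-in aggregation for this):

```python
def merge_bio_tags(tokens, tags):
    """Merge per-token BIO tags into (entity_text, entity_type) spans.

    Illustrative helper, not part of the model itself; assumes tags
    like "B-PER", "I-PER", "O" as used in CoNLL-2003.
    """
    entities = []
    current_tokens, current_type = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):  # start of a new entity
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current_type == tag[2:]:
            current_tokens.append(token)  # continuation of current entity
        else:  # "O" tag or an I- tag that doesn't match the open entity
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [], None
    if current_tokens:  # flush any entity still open at the end
        entities.append((" ".join(current_tokens), current_type))
    return entities

tokens = ["Angela", "Merkel", "visited", "Paris", "."]
tags = ["B-PER", "I-PER", "O", "B-LOC", "O"]
print(merge_bio_tags(tokens, tags))
# → [('Angela Merkel', 'PER'), ('Paris', 'LOC')]
```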

Use Cases

Information Extraction
News Entity Extraction
Identify entities such as person names, locations, and organization names from news texts.
Reaches an F1 score of 0.9295.
Document Analysis
Recognize key entities in legal or medical documents.
Data Preprocessing
Knowledge Graph Construction
Extract entity information for knowledge graph construction.