
DistilBERT Base Uncased Fine-tuned NER

Developed by dbsamu
A named entity recognition model based on distilbert-base-uncased and fine-tuned on the WikiAnn dataset, achieving an F1 score of 0.8210 on the evaluation set.
Downloads 15
Release Time: 3/2/2022

Model Overview

This model is designed for named entity recognition: identifying and labeling named entities in text, such as person names, locations, and organization names.

Model Features

Efficient and Lightweight
Based on the DistilBERT architecture, it is 40% smaller than the standard BERT model while maintaining 95% of its performance.
High Accuracy
Achieves an accuracy of 0.9204 and an F1 score of 0.8210 on the WikiAnn evaluation set.
Fast Inference
The distilled model design enables inference speeds 60% faster than the full BERT model.
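The accuracy and F1 numbers above are standard token-classification metrics; F1 for NER is usually computed over predicted entity spans rather than individual tokens. A minimal sketch of span-level F1 in plain Python (real evaluations typically use a library such as seqeval):

```python
# Minimal span-level F1, the metric style used for NER evaluation.
# Each argument is a set of (start, end, label) entity spans; a predicted
# span counts as correct only if boundaries and label both match exactly.

def span_f1(true_spans, pred_spans):
    tp = len(true_spans & pred_spans)  # exact-match true positives
    precision = tp / len(pred_spans) if pred_spans else 0.0
    recall = tp / len(true_spans) if true_spans else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

gold = {(0, 2, "PER"), (5, 6, "LOC")}
pred = {(0, 2, "PER"), (5, 6, "ORG")}  # second span has the wrong label
print(span_f1(gold, pred))  # → 0.5
```

Because matching is exact, a span with correct boundaries but the wrong label (as in the example) counts against both precision and recall.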

Model Capabilities

Named Entity Recognition
Text Token Classification
English Text Processing
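The capabilities above can be exercised through the Hugging Face transformers pipeline API. A sketch is below; note that the Hub id is an assumption pieced together from the author ("dbsamu") and model name on this page, so verify it before use. The BIO-tag merging helper illustrates how per-token predictions are folded into entity spans.

```python
# Sketch of running this model for NER via the `transformers` pipeline.

def extract_entities(text):
    """Run the fine-tuned NER model; downloads weights on first call."""
    from transformers import pipeline  # pip install transformers

    ner = pipeline(
        "token-classification",
        model="dbsamu/distilbert-base-uncased-finetuned-ner",  # assumed Hub id
        aggregation_strategy="simple",  # merge sub-word pieces into entities
    )
    return [(e["entity_group"], e["word"]) for e in ner(text)]


def merge_bio_tags(tokens, tags):
    """Fold per-token BIO tags (the WikiAnn label scheme) into entity spans."""
    spans, current = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):  # a new entity begins
            if current:
                spans.append(tuple(current))
            current = [token, tag[2:]]
        elif tag.startswith("I-") and current and tag[2:] == current[1]:
            current[0] += " " + token  # continue the open entity
        else:  # "O" (or an inconsistent tag) closes any open entity
            if current:
                spans.append(tuple(current))
            current = None
    if current:
        spans.append(tuple(current))
    return spans


print(merge_bio_tags(
    ["Barack", "Obama", "visited", "Paris"],
    ["B-PER", "I-PER", "O", "B-LOC"],
))  # → [('Barack Obama', 'PER'), ('Paris', 'LOC')]
```

With aggregation_strategy="simple", the pipeline performs this kind of sub-word and tag merging internally, returning one entry per detected entity.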

Use Cases

Information Extraction
Entity Extraction from News Articles
Extract key information from news articles, such as person names, locations, and organization names, accurately identifying the entities mentioned.
Knowledge Graph Construction
Entity Recognition for Knowledge Graphs
Identify and classify entities in text, providing high-quality entity recognition results for knowledge graph construction.