
DistilBERT Base Uncased Fine-tuned NER

Developed by mcdzwil
A lightweight named entity recognition model based on DistilBERT, fine-tuned for entity recognition in English text.
Downloads 15
Released: 3/2/2022

Model Overview

This model is a lightweight DistilBERT variant fine-tuned for Named Entity Recognition (NER). It retains most of BERT's performance while substantially reducing model size and compute requirements.

Model Features

Lightweight and Efficient
Built on the DistilBERT architecture, which is about 40% smaller and 60% faster than BERT-base while retaining roughly 97% of its language-understanding performance.
High-precision Recognition
Achieves 91.71% precision and an 80.03% F1 score on the evaluation set.
Easy to Deploy
The model's small footprint makes it suitable for production deployment on modest hardware.
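As a sketch of how such a model is typically used, the commented lines below show the Hugging Face `pipeline` call (assuming the model is published on the Hub under an identifier like `mcdzwil/distilbert-base-uncased-finetuned-ner`; the exact repo id is an assumption). The runnable part is a plain-Python helper that merges the pipeline's token-level B-/I- tags into entity spans:

```python
# Loading the model (hedged: the repo id below is an assumption, not
# confirmed by this page):
#
#   from transformers import pipeline
#   ner = pipeline("ner", model="mcdzwil/distilbert-base-uncased-finetuned-ner")
#   tokens = ner("Angela Merkel visited Paris.")

def group_bio(tokens):
    """Merge token-level B-/I- predictions into (entity_type, text) spans."""
    entities = []
    current = None
    for tok in tokens:
        tag = tok["entity"]
        if tag.startswith("B-") or (tag.startswith("I-") and current is None):
            # Start of a new entity (or a stray I- tag treated as a start).
            if current:
                entities.append(current)
            current = {"type": tag[2:], "words": [tok["word"]]}
        elif tag.startswith("I-") and current and tag[2:] == current["type"]:
            # Continuation of the current entity.
            current["words"].append(tok["word"])
        else:
            # "O" tag or a type mismatch closes the current entity.
            if current:
                entities.append(current)
            current = None
    if current:
        entities.append(current)
    return [(e["type"], " ".join(e["words"])) for e in entities]

# Token-level output in the shape produced by the transformers "ner"
# pipeline without aggregation (values are illustrative):
sample = [
    {"word": "Angela", "entity": "B-PER"},
    {"word": "Merkel", "entity": "I-PER"},
    {"word": "visited", "entity": "O"},
    {"word": "Paris", "entity": "B-LOC"},
]
print(group_bio(sample))  # [('PER', 'Angela Merkel'), ('LOC', 'Paris')]
```

Newer versions of `transformers` can do this grouping internally via the pipeline's `aggregation_strategy` argument; the helper above just makes the BIO-merging logic explicit.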

Model Capabilities

Text Entity Recognition
English Named Entity Extraction
Sequence Labeling

Use Cases

Information Extraction
News Entity Extraction
Identify and extract entities such as person names, locations, and organization names from news texts. Reported accuracy: 93.16%.
Medical Text Processing
Identify domain terms such as diseases, medications, and symptoms in medical reports.
Business Intelligence
Customer Feedback Analysis
Extract product names and key features from customer reviews.