
DistilBERT Base Uncased Fine-tuned NER

Developed by SnailPoo
This is a Named Entity Recognition (NER) model fine-tuned from the DistilBERT base uncased checkpoint. The training dataset is unknown; the model reaches an F1 score of 0.8545 on its evaluation set.
Downloads 15
Release date: 4/21/2022

Model Overview

A lightweight NER model based on the distilled BERT architecture, suitable for named entity recognition tasks in English text.

Model Features

Lightweight and Efficient
Utilizes a distilled BERT architecture to reduce computational resource requirements while maintaining performance.
High Accuracy
Achieves 96.38% accuracy and an F1 score of 0.8545 on the evaluation set.
Fast Training
Requires only 3 training epochs to achieve good results, suitable for rapid deployment.
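The card reports two different metrics: 96.38% is token-level accuracy, while 0.8545 is entity-level F1, which balances precision and recall. A minimal sketch of how F1 is derived from entity counts (the counts below are hypothetical, chosen only to illustrate the arithmetic, not taken from this model's actual evaluation):

```python
# Hypothetical entity counts illustrating how an F1 score is computed.
# These numbers are for illustration only, not the model's real results.
true_positives = 854   # entities found and labeled correctly
false_positives = 146  # spurious or mislabeled predictions
false_negatives = 145  # gold entities the model missed

precision = true_positives / (true_positives + false_positives)
recall = true_positives / (true_positives + false_negatives)
f1 = 2 * precision * recall / (precision + recall)

print(f"precision={precision:.4f} recall={recall:.4f} f1={f1:.4f}")
```

Entity-level F1 is usually well below token-level accuracy because most tokens are non-entities ("O" tags), which inflate accuracy but do not count toward F1.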

Model Capabilities

English text entity recognition
Sequence labeling
Entity classification
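A token-classification NER model emits one BIO tag per token; turning that sequence-labeling output into usable entities requires grouping adjacent tags into spans. A minimal sketch of that grouping step (the tag names and example tokens are illustrative and are not taken from this model's actual label set):

```python
def group_bio_entities(tokens, tags):
    """Merge per-token BIO tags into (entity_type, text) spans."""
    entities = []
    current_type, current_tokens = None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):  # a new entity begins
            if current_tokens:
                entities.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_type == tag[2:]:
            current_tokens.append(token)  # the current entity continues
        else:  # "O" tag or an I- tag inconsistent with the open entity
            if current_tokens:
                entities.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = None, []
    if current_tokens:  # flush an entity that runs to the end of the sequence
        entities.append((current_type, " ".join(current_tokens)))
    return entities

tokens = ["Angela", "Merkel", "visited", "Paris", "."]
tags = ["B-PER", "I-PER", "O", "B-LOC", "O"]
print(group_bio_entities(tokens, tags))
# → [('PER', 'Angela Merkel'), ('LOC', 'Paris')]
```

In practice, libraries that wrap token-classification models perform this aggregation for you; the sketch shows what that post-processing amounts to.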

Use Cases

Information Extraction
News Entity Extraction
Identifies entities such as person names, locations, and organization names in news text, accurately marking key entity information.
Document Processing
Contract Key Information Extraction
Automatically identifies key entities in contract documents, such as party names, dates, and amounts.