distilbert-base-multilingual-cased-finetuned-conll2003-ner
This is a multilingual model based on DistilBERT, specifically fine-tuned for named entity recognition tasks on the CoNLL 2003 dataset.
Sequence Labeling
Tags: Transformers, Supports Multiple Languages, Multilingual NER, High-precision entity recognition, CoNLL2003 optimization

Downloads: 73
Release Time: 3/2/2022
Model Overview
A named entity recognition model built on the distilbert-base-multilingual-cased architecture, supporting entity recognition in multiple languages such as English, German, Dutch, and Spanish.
Model Features
Multilingual Support
Supports named entity recognition in multiple languages including English, German, Dutch, and Spanish
Efficient Model
Based on the DistilBERT architecture, it reduces model size and computational requirements while maintaining performance
High Accuracy
Achieves an F1 score of 0.9409 on the CoNLL 2003 dataset
Model Capabilities
Named Entity Recognition
Multilingual Text Processing
Entity Classification
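A minimal inference sketch using the Hugging Face transformers pipeline. The hub path passed to `model` is an assumption derived from the model title; the namespace prefix of the actual repository may differ and should be adjusted accordingly.
```python
# Minimal sketch of NER inference with the transformers pipeline.
# The model ID below is an assumption based on the card title; prepend the
# correct Hub namespace (e.g. "<user>/...") if your copy lives under an account.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="distilbert-base-multilingual-cased-finetuned-conll2003-ner",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entity spans
)

for entity in ner("Angela Merkel met Volkswagen executives in Wolfsburg."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```
With `aggregation_strategy="simple"`, word pieces are grouped so each printed row is a complete entity span, labeled with the CoNLL-2003 classes (PER, LOC, ORG, MISC).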
Use Cases
Information Extraction
News Article Entity Extraction
Identify entities such as person names, locations, and organization names from news articles
Accuracy: 99.02%
Multilingual Document Processing
Handle entity recognition in documents that mix several languages, as sketched below
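A sketch of the multilingual use case, reusing the `ner` pipeline from the snippet above on one sentence per supported language; the example sentences are illustrative and not taken from the dataset.
```python
# Run the same pipeline over sentences in the four supported languages.
# Assumes the `ner` pipeline object created in the previous snippet.
sentences = [
    "Barack Obama gave a speech in Chicago.",                   # English
    "Angela Merkel besuchte das Siemens-Werk in München.",      # German
    "Koning Willem-Alexander opende een museum in Amsterdam.",  # Dutch
    "Pedro Sánchez visitó la sede de Telefónica en Madrid.",    # Spanish
]

for sentence in sentences:
    entities = ner(sentence)
    print([(e["entity_group"], e["word"]) for e in entities])
```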