# Named Entity Recognition

The models below are listed with author, license, tags, and download/like counts (a dash marks an unlisted license):

| Model | Author | License | Description | Tags | Downloads | Likes |
|---|---|---|---|---|---|---|
| Universal NER UniNER 7B All Bnb 4bit Smashed | PrunaAI | — | PrunaAI's compressed build of UniNER-7B-all; 4-bit quantization cuts memory use and energy consumption while preserving strong NER capability | Large Language Model, Transformers | 22 | 1 |
| Illuni Llama 2 Ko 7b | illuni | MIT | Korean large language model based on beomi/llama-2-ko-7b, suited to Q&A tasks | Large Language Model, Transformers, Korean | 65 | 2 |
| Electra Small Ner | rv2307 | Apache-2.0 | NER model fine-tuned from electra-small; recognizes three entity types: locations, persons, and organizations | Sequence Labeling, Transformers, English | 74 | 3 |
| Dictabert Joint | dicta-il | — | State-of-the-art multi-task joint parsing BERT for Modern Hebrew, covering five tasks: prefix segmentation, morphological disambiguation, lexical analysis, syntactic parsing, and NER | Sequence Labeling, Transformers, Other | 3,678 | 2 |
| Bert Finetuned Ner | fundrais123 | Apache-2.0 | NER model fine-tuned from bert-base-cased on the CoNLL-2003 dataset | Sequence Labeling, Transformers | 23 | 1 |
| Bertimbau | tubyneto | MIT | BERT model pre-trained for Brazilian Portuguese, performing well across a range of NLP tasks | Large Language Model, Other | 38 | 1 |
| Afrolm Active Learning | bonadossou | — | AfroLM, a language model pretrained on 23 African languages; its active-learning framework reaches high performance from minimal data | Large Language Model, Transformers, Other | 132 | 8 |
| Roberta Base Mnli Uf Ner 1024 Train V0 | mariolinml | MIT | RoBERTa-base fine-tuned on the MNLI dataset, suited to natural language inference | Large Language Model, Transformers | 26 | 1 |
| Bert Finetuned Ner | ankitsharma | Apache-2.0 | NER model fine-tuned from bert-base-cased | Sequence Labeling, Transformers | 14 | 0 |
| Distilbert Base Uncased Finetuned Ner | kinanmartin | Apache-2.0 | Lightweight DistilBERT-based model fine-tuned for NER on a toy dataset | Sequence Labeling, Transformers | 16 | 0 |
| Astrobert | adsabs | MIT | Language model for astronomy and astrophysics from the NASA/ADS team; supports masked-token filling, NER, and text classification | Large Language Model, Transformers, English | 215 | 14 |
| Distilbert Base Uncased Finetuned Ner | IsaMaks | Apache-2.0 | Lightweight DistilBERT-based NER model fine-tuned on an unspecified dataset | Sequence Labeling, Transformers | 15 | 0 |
| Bert Finetuned Ner 0 | mariolinml | Apache-2.0 | bert-base-cased fine-tuned on an unknown dataset, used mainly for NER | Sequence Labeling, Transformers | 15 | 0 |
| Distilbert Base Uncased Finetuned Ner | hossay | Apache-2.0 | Lightweight DistilBERT-based NER model fine-tuned on CoNLL-2003 | Sequence Labeling, Transformers | 16 | 0 |
| Bert Base Swedish Cased Ner | KB | — | Swedish BERT base model from the National Library of Sweden / KBLab, trained on multi-source text | Large Language Model, Other | 20.77k | 8 |
| Distilbert Base Uncased Finetuned Ner | murdockthedude | Apache-2.0 | Lightweight DistilBERT-based NER model fine-tuned on CoNLL-2003 | Sequence Labeling, Transformers | 15 | 0 |
| Distilbert Base Uncased Finetuned TT2 Exam | roschmid | Apache-2.0 | distilbert-base-uncased fine-tuned on CoNLL-2003 for token classification | Sequence Labeling, Transformers | 15 | 0 |
| Distilbert Base Uncased Finetuned Ner | roschmid | Apache-2.0 | Lightweight DistilBERT-based NER model fine-tuned on CoNLL-2003, offering efficient inference and high accuracy | Sequence Labeling, Transformers | 15 | 0 |
| Distilbert Base Uncased Finetuned Ner | chancar | Apache-2.0 | NER model fine-tuned from distilbert-base-uncased | Sequence Labeling, Transformers | 15 | 0 |
| Distilbert Base Uncased Finetuned Ner | Udi-Aharon | Apache-2.0 | Lightweight DistilBERT-based model fine-tuned on CoNLL-2003 for NER | Sequence Labeling, Transformers | 15 | 0 |
| Distilbert Base Uncased Finetuned Ner | guhuawuli | Apache-2.0 | Lightweight DistilBERT-based model fine-tuned on the CoNLL-2003 NER task | Sequence Labeling, Transformers | 15 | 0 |
| Fi Core News Sm | spacy | — | CPU-optimized Finnish processing pipeline with token classification and dependency parsing | Sequence Labeling, Other | 45 | 0 |
| Ko Core News Sm | spacy | — | CPU-optimized Korean processing pipeline covering tokenization, part-of-speech tagging, dependency parsing, NER, and more | Sequence Labeling, Korean | 62 | 1 |
| Test | vegetable | Apache-2.0 | hfl/chinese-bert-wwm-ext fine-tuned on CoNLL-2003 for token classification | Sequence Labeling, Transformers | 15 | 0 |
| Distilbert Base Uncased Finetuned Ner Final | Lilya | — | Lightweight DistilBERT-based NER model fine-tuned for specific tasks | Sequence Labeling, Transformers | 15 | 0 |
| Bert Base Cased Ner Conll2003 | kamalkraj | Apache-2.0 | NER model fine-tuned from bert-base-cased on CoNLL-2003 | Sequence Labeling, Transformers | 38 | 0 |
| Distilbert Base Uncased Finetuned Ner | SnailPoo | Apache-2.0 | DistilBERT-base NER fine-tune trained on an unknown dataset; evaluation-set F1 of 0.8545 | Sequence Labeling, Transformers | 15 | 0 |
| Cybonto Distilbert Base Uncased Finetuned Ner Wnut17 | theResearchNinja | Apache-2.0 | distilbert-base-uncased fine-tuned on the wnut_17 dataset to identify specific entity categories in text | Sequence Labeling, Transformers | 18 | 0 |
| Cybonto Distilbert Base Uncased Finetuned Ner FewNerd | theResearchNinja | Apache-2.0 | distilbert-base-uncased fine-tuned on the few_nerd dataset; evaluation-set F1 of 0.7621 | Sequence Labeling, Transformers | 17 | 0 |
| Afro Xlmr Mini | Davlan | MIT | Adapted from XLM-R-miniLM via masked-language-model (MLM) training on 17 African languages, covering major African language families plus three high-resource languages (Arabic, French, English) | Large Language Model, Transformers | 66 | 0 |
| Afro Xlmr Small | Davlan | MIT | Compact XLM-R variant for 17 African languages, built via vocabulary reduction and multilingual adaptive training | Large Language Model, Transformers | 33 | 1 |
| Distilbert Base Uncased Finetuned Ner | ACSHCSE | Apache-2.0 | Lightweight DistilBERT-based NER model fine-tuned on CoNLL-2003 | Sequence Labeling, Transformers | 15 | 0 |
| Distilbert Base Uncased Finetuned Ner | issifuamajeed | Apache-2.0 | NER model fine-tuned from distilbert-base-uncased on CoNLL-2003, performing well on NER tasks | Sequence Labeling, Transformers | 947 | 0 |
| Uk Ner | ukr-models | MIT | Ukrainian NER model fine-tuned from XLM-RoBERTa-Uk; recognizes person names, locations, and organizations | Sequence Labeling, Transformers, Other | 84 | 4 |
| Mobilebert Finetuned Ner | vumichien | MIT | NER model fine-tuned on the MobileBERT architecture | Sequence Labeling, Transformers | 39 | 0 |
| Bert Base NER | optimum | MIT | BERT-base NER model recognizing four entity types, including locations, organizations, and persons | Sequence Labeling, Transformers, English | 69 | 2 |
| Distilbert Base Uncased Finetuned Ner | tiennvcs | Apache-2.0 | Lightweight DistilBERT-based NER model fine-tuned on CoNLL-2003, suited to entity tagging in English text | Sequence Labeling, Transformers | 15 | 0 |
| Distilbert Base Uncased Finetuned Combinedmodel1 Ner | akshaychaudhary | Apache-2.0 | distilbert-base-uncased fine-tuned on a specific dataset, primarily for NER | Sequence Labeling, Transformers | 15 | 0 |
| Distilbert Base Uncased Finetuned Ner | MikhailGalperin | Apache-2.0 | Lightweight DistilBERT-based NER model fine-tuned on CoNLL-2003 | Sequence Labeling, Transformers | 15 | 0 |
| Bert Base Portuguese Cased | neuralmind | MIT | Pretrained BERT for Brazilian Portuguese, achieving state-of-the-art results on multiple NLP tasks | Large Language Model, Other | 257.25k | 181 |
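Most of the CoNLL-2003 fine-tunes above are token classifiers that emit per-token BIO labels (O, plus B-/I- variants of PER, ORG, LOC, and MISC). A minimal sketch of collapsing such label sequences into entity spans, independent of any particular checkpoint; the tokens and labels below are illustrative, not model output:

```python
def bio_to_spans(tokens, labels):
    """Collapse per-token BIO labels into (entity_type, text) spans."""
    spans = []
    current_type, current_tokens = None, []
    for token, label in zip(tokens, labels):
        if label.startswith("B-"):
            # B- starts a new entity, closing any span still open
            if current_type:
                spans.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = label[2:], [token]
        elif label.startswith("I-") and current_type == label[2:]:
            # I- of the same type continues the open span
            current_tokens.append(token)
        else:
            # "O" (or an inconsistent I- tag) closes any open span
            if current_type:
                spans.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = None, []
    if current_type:
        spans.append((current_type, " ".join(current_tokens)))
    return spans

tokens = ["Angela", "Merkel", "visited", "Paris", "."]
labels = ["B-PER", "I-PER", "O", "B-LOC", "O"]
print(bio_to_spans(tokens, labels))  # [('PER', 'Angela Merkel'), ('LOC', 'Paris')]
```

Libraries such as Hugging Face `transformers` perform an equivalent aggregation internally (the token-classification pipeline's aggregation strategies), but the hand-rolled version makes the BIO convention these models share explicit.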
© 2025 AIbase