Test
Developed by vegetable
This model is a fine-tuned version of hfl/chinese-bert-wwm-ext on the conll2003 dataset, designed for token classification tasks.
Downloads 15
Release Time: 4/28/2022
Model Overview
This model is a token classification model based on the BERT architecture, primarily used for sequence labeling tasks such as named entity recognition.
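The card does not include inference code; below is a minimal sketch of how a fine-tuned token classification checkpoint like this one is queried directly. The checkpoint path is a placeholder, since the card does not state a published model ID.

```python
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

# Placeholder path: substitute the actual fine-tuned checkpoint directory or hub ID.
checkpoint = "./path/to/fine-tuned-checkpoint"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(checkpoint)

text = "Example sentence for sequence labeling."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, sequence_length, num_labels)

# Map each token's highest-scoring label id to its tag name.
predictions = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, pred in zip(tokens, predictions):
    print(token, model.config.id2label[pred.item()])
```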
Model Features
High Accuracy
Achieved an accuracy of 88.47% on the conll2003 evaluation set.
Balanced Performance
Precision (76.96%) and recall (83.96%) are well-balanced, with an F1 score reaching 80.31%.
Chinese Optimization
Fine-tuned from the Chinese pre-trained model hfl/chinese-bert-wwm-ext.
Model Capabilities
Named Entity Recognition
Sequence Labeling
Text Token Classification
Use Cases
Natural Language Processing
Chinese Named Entity Recognition
Identify entities such as person names, locations, and organization names in Chinese text.
F1 score reaches 80.31%.
Information Extraction
Extract structured information from unstructured text.
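A hedged sketch of both use cases with the transformers pipeline API: the checkpoint path and the example sentence are placeholders, and the entity labels follow conll2003's PER/LOC/ORG/MISC scheme.

```python
from collections import defaultdict
from transformers import pipeline

# Placeholder checkpoint path; replace with the published model ID or a local directory.
ner = pipeline(
    "token-classification",
    model="./path/to/fine-tuned-checkpoint",
    aggregation_strategy="simple",  # merge sub-tokens into whole entity spans
)

entities = ner("Barack Obama visited Paris with the United Nations delegation.")

# Turn the flat prediction list into structured output grouped by entity type.
structured = defaultdict(list)
for ent in entities:
    structured[ent["entity_group"]].append(ent["word"])
print(dict(structured))  # e.g. {"PER": [...], "LOC": [...], "ORG": [...]}
```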
Test
This model is a fine-tuned version of hfl/chinese-bert-wwm-ext on the conll2003 dataset. It can be used for token classification tasks and achieves good results on the evaluation set.
Documentation
Model Evaluation Results
It achieves the following results on the evaluation set:
- Loss: 0.7372
- Precision: 0.7696
- Recall: 0.8396
- F1: 0.8031
- Accuracy: 0.8847
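These are entity-level metrics of the kind the standard Hugging Face token-classification examples compute with the seqeval library (whether this card used seqeval is an assumption). A small sketch of how such numbers are derived from IOB2 tag sequences, with illustrative gold and predicted tags:

```python
from seqeval.metrics import accuracy_score, f1_score, precision_score, recall_score

# Illustrative gold and predicted IOB2 tag sequences (one inner list per sentence).
y_true = [["B-PER", "I-PER", "O", "B-LOC"]]
y_pred = [["B-PER", "I-PER", "O", "B-ORG"]]

print("precision:", precision_score(y_true, y_pred))  # entity-level precision
print("recall:   ", recall_score(y_true, y_pred))     # entity-level recall
print("f1:       ", f1_score(y_true, y_pred))         # entity-level F1
print("accuracy: ", accuracy_score(y_true, y_pred))   # token-level accuracy
```

Note that precision, recall, and F1 count whole entity spans, while accuracy is computed per token, which is why accuracy (88.47%) sits well above F1 (80.31%) here.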
Model Index
Property | Details |
---|---|
Model Name | test |
Task | Token Classification |
Dataset | conll2003 |
Precision | 0.7696078431372549 |
Recall | 0.839572192513369 |
F1 | 0.8030690537084398 |
Accuracy | 0.8847040737893928 |
Technical Details
Training Hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
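The card does not ship a training script; the following is a hedged reconstruction of a Trainer setup matching the hyperparameters listed above. The dataset handling and label-alignment helper follow the standard Hugging Face token-classification recipe, not anything stated in this card.

```python
from datasets import load_dataset
from transformers import (AutoModelForTokenClassification, AutoTokenizer,
                          DataCollatorForTokenClassification, Trainer,
                          TrainingArguments)

# Base model and dataset from the card; conll2003 carries 9 IOB2 NER tags.
base = "hfl/chinese-bert-wwm-ext"
raw = load_dataset("conll2003")
label_list = raw["train"].features["ner_tags"].feature.names

tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForTokenClassification.from_pretrained(
    base, num_labels=len(label_list))

def tokenize_and_align(batch):
    # Tokenize pre-split words and align NER tags with sub-tokens; special
    # tokens and word continuations get the ignore index -100.
    enc = tokenizer(batch["tokens"], is_split_into_words=True, truncation=True)
    all_labels = []
    for i, tags in enumerate(batch["ner_tags"]):
        previous, labels = None, []
        for word_id in enc.word_ids(batch_index=i):
            labels.append(-100 if word_id is None or word_id == previous
                          else tags[word_id])
            previous = word_id
        all_labels.append(labels)
    enc["labels"] = all_labels
    return enc

tokenized = raw.map(tokenize_and_align, batched=True,
                    remove_columns=raw["train"].column_names)

args = TrainingArguments(
    output_dir="out",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # the log below reports metrics once per epoch
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the Trainer's default optimizer.

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    data_collator=DataCollatorForTokenClassification(tokenizer),
    tokenizer=tokenizer,
)
trainer.train()
```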
Training Results
Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
---|---|---|---|---|---|---|---|
No log | 1.0 | 2 | 1.9496 | 0.0 | 0.0 | 0.0 | 0.4889 |
No log | 2.0 | 4 | 1.6137 | 0.0 | 0.0 | 0.0 | 0.4919 |
No log | 3.0 | 6 | 1.3906 | 0.0 | 0.0 | 0.0 | 0.5650 |
No log | 4.0 | 8 | 1.2273 | 0.0652 | 0.0481 | 0.0554 | 0.6856 |
No log | 5.0 | 10 | 1.0565 | 0.2051 | 0.1711 | 0.1866 | 0.7125 |
No log | 6.0 | 12 | 0.9150 | 0.5094 | 0.4332 | 0.4682 | 0.7540 |
No log | 7.0 | 14 | 0.8051 | 0.5988 | 0.5187 | 0.5559 | 0.7679 |
No log | 8.0 | 16 | 0.7151 | 0.6707 | 0.5989 | 0.6328 | 0.7763 |
No log | 9.0 | 18 | 0.6334 | 0.6685 | 0.6364 | 0.6521 | 0.8086 |
No log | 10.0 | 20 | 0.5693 | 0.6957 | 0.6845 | 0.6900 | 0.8201 |
No log | 11.0 | 22 | 0.5192 | 0.7166 | 0.7166 | 0.7166 | 0.8363 |
No log | 12.0 | 24 | 0.4736 | 0.7135 | 0.7326 | 0.7230 | 0.8524 |
No log | 13.0 | 26 | 0.4448 | 0.6938 | 0.7754 | 0.7323 | 0.8555 |
No log | 14.0 | 28 | 0.4280 | 0.7177 | 0.8021 | 0.7576 | 0.8586 |
No log | 15.0 | 30 | 0.4179 | 0.7588 | 0.8075 | 0.7824 | 0.8663 |
No log | 16.0 | 32 | 0.4214 | 0.7356 | 0.8182 | 0.7747 | 0.8593 |
No log | 17.0 | 34 | 0.4070 | 0.7391 | 0.8182 | 0.7766 | 0.8616 |
No log | 18.0 | 36 | 0.4112 | 0.7586 | 0.8235 | 0.7897 | 0.8724 |
No log | 19.0 | 38 | 0.4530 | 0.7330 | 0.8075 | 0.7684 | 0.8693 |
No log | 20.0 | 40 | 0.4719 | 0.7766 | 0.8182 | 0.7969 | 0.8732 |
No log | 21.0 | 42 | 0.4886 | 0.7260 | 0.8075 | 0.7646 | 0.8632 |
No log | 22.0 | 44 | 0.5007 | 0.7217 | 0.8182 | 0.7669 | 0.8701 |
No log | 23.0 | 46 | 0.5169 | 0.7321 | 0.8182 | 0.7727 | 0.8762 |
No log | 24.0 | 48 | 0.5531 | 0.7238 | 0.8128 | 0.7657 | 0.8724 |
No log | 25.0 | 50 | 0.5895 | 0.7311 | 0.8289 | 0.7769 | 0.8655 |
No log | 26.0 | 52 | 0.5482 | 0.7330 | 0.8075 | 0.7684 | 0.8778 |
No log | 27.0 | 54 | 0.5361 | 0.7488 | 0.8128 | 0.7795 | 0.8832 |
No log | 28.0 | 56 | 0.5378 | 0.7427 | 0.8182 | 0.7786 | 0.8847 |
No log | 29.0 | 58 | 0.5543 | 0.7371 | 0.8396 | 0.7850 | 0.8824 |
No log | 30.0 | 60 | 0.5564 | 0.7585 | 0.8396 | 0.7970 | 0.8839 |
No log | 31.0 | 62 | 0.5829 | 0.7235 | 0.8396 | 0.7772 | 0.8724 |
No log | 32.0 | 64 | 0.5974 | 0.7269 | 0.8396 | 0.7792 | 0.8716 |
No log | 33.0 | 66 | 0.5750 | 0.7610 | 0.8342 | 0.7959 | 0.8839 |
No log | 34.0 | 68 | 0.5887 | 0.7723 | 0.8342 | 0.8021 | 0.8878 |
No log | 35.0 | 70 | 0.6219 | 0.7441 | 0.8396 | 0.7889 | 0.8747 |
No log | 36.0 | 72 | 0.6676 | 0.7269 | 0.8396 | 0.7792 | 0.8632 |
No log | 37.0 | 74 | 0.6517 | 0.7452 | 0.8289 | 0.7848 | 0.8693 |
No log | 38.0 | 76 | 0.6346 | 0.7828 | 0.8289 | 0.8052 | 0.8862 |
No log | 39.0 | 78 | 0.6239 | 0.7839 | 0.8342 | 0.8083 | 0.8855 |
No log | 40.0 | 80 | 0.6360 | 0.7277 | 0.8289 | 0.7750 | 0.8762 |
No log | 41.0 | 82 | 0.6645 | 0.7336 | 0.8396 | 0.7830 | 0.8701 |
No log | 42.0 | 84 | 0.6611 | 0.7406 | 0.8396 | 0.7870 | 0.8747 |
No log | 43.0 | 86 | 0.6707 | 0.7488 | 0.8289 | 0.7868 | 0.8762 |
No log | 44.0 | 88 | 0.6901 | 0.7277 | 0.8289 | 0.7750 | 0.8709 |
No log | 45.0 | 90 | 0.6911 | 0.7393 | 0.8342 | 0.7839 | 0.8709 |
No log | 46.0 | 92 | 0.6540 | 0.7761 | 0.8342 | 0.8041 | 0.8878 |
No log | 47.0 | 94 | 0.6381 | 0.7761 | 0.8342 | 0.8041 | 0.8916 |
No log | 48.0 | 96 | 0.6285 | 0.7745 | 0.8449 | 0.8082 | 0.8885 |
No log | 49.0 | 98 | 0.6449 | 0.7692 | 0.8556 | 0.8101 | 0.8862 |
No log | 50.0 | 100 | 0.6809 | 0.7442 | 0.8556 | 0.7960 | 0.8732 |
No log | 51.0 | 102 | 0.6898 | 0.7395 | 0.8503 | 0.7910 | 0.8716 |
No log | 52.0 | 104 | 0.6897 | 0.7500 | 0.8503 | 0.7970 | 0.8762 |
No log | 53.0 | 106 | 0.6714 | 0.7656 | 0.8556 | 0.8081 | 0.8855 |
No log | 54.0 | 108 | 0.6612 | 0.7692 | 0.8556 | 0.8101 | 0.8855 |
No log | 55.0 | 110 | 0.6583 | 0.7692 | 0.8556 | 0.8101 | 0.8855 |
No log | 56.0 | 112 | 0.6648 | 0.7692 | 0.8556 | 0.8101 | 0.8855 |
No log | 57.0 | 114 | 0.6757 | 0.7656 | 0.8556 | 0.8081 | 0.8832 |
No log | 58.0 | 116 | 0.6803 | 0.7656 | 0.8556 | 0.8081 | 0.8839 |
No log | 59.0 | 118 | 0.6834 | 0.7692 | 0.8556 | 0.8101 | 0.8862 |
No log | 60.0 | 120 | 0.6889 | 0.7833 | 0.8503 | 0.8154 | 0.8878 |
No log | 61.0 | 122 | 0.6963 | 0.7772 | 0.8396 | 0.8072 | 0.8862 |
No log | 62.0 | 124 | 0.7057 | 0.7772 | 0.8396 | 0.8072 | 0.8862 |
No log | 63.0 | 126 | 0.7212 | 0.7910 | 0.8503 | 0.8196 | 0.8862 |
No log | 64.0 | 128 | 0.7334 | 0.7833 | 0.8503 | 0.8154 | 0.8824 |
No log | 65.0 | 130 | 0.7398 | 0.7833 | 0.8503 | 0.8154 | 0.8801 |
No log | 66.0 | 132 | 0.7400 | 0.7833 | 0.8503 | 0.8154 | 0.8809 |
No log | 67.0 | 134 | 0.7345 | 0.7783 | 0.8449 | 0.8103 | 0.8855 |
No log | 68.0 | 136 | 0.7270 | 0.7900 | 0.8449 | 0.8165 | 0.8870 |
No log | 69.0 | 138 | 0.7245 | 0.7839 | 0.8342 | 0.8083 | 0.8862 |
No log | 70.0 | 140 | 0.7260 | 0.7868 | 0.8289 | 0.8073 | 0.8847 |
No log | 71.0 | 142 | 0.7275 | 0.7817 | 0.8235 | 0.8021 | 0.8839 |
No log | 72.0 | 144 | 0.7283 | 0.7778 | 0.8235 | 0.8000 | 0.8832 |
No log | 73.0 | 146 | 0.7296 | 0.7800 | 0.8342 | 0.8062 | 0.8847 |
No log | 74.0 | 148 | 0.7344 | 0.7734 | 0.8396 | 0.8051 | 0.8832 |
No log | 75.0 | 150 | 0.7314 | 0.7745 | 0.8449 | 0.8082 | 0.8824 |
No log | 76.0 | 152 | 0.7299 | 0.7794 | 0.8503 | 0.8133 | 0.8832 |
No log | 77.0 | 154 | 0.7282 | 0.7794 | 0.8503 | 0.8133 | 0.8839 |
No log | 78.0 | 156 | 0.7252 | 0.7783 | 0.8449 | 0.8103 | 0.8839 |
No log | 79.0 | 158 | 0.7216 | 0.7756 | 0.8503 | 0.8112 | 0.8855 |
No log | 80.0 | 160 | 0.7194 | 0.7756 | 0.8503 | 0.8112 | 0.8870 |
No log | 81.0 | 162 | 0.7191 | 0.7756 | 0.8503 | 0.8112 | 0.8878 |
No log | 82.0 | 164 | 0.7201 | 0.7696 | 0.8396 | 0.8031 | 0.8862 |
No log | 83.0 | 166 | 0.7211 | 0.7696 | 0.8396 | 0.8031 | 0.8862 |
No log | 84.0 | 168 | 0.7222 | 0.7696 | 0.8396 | 0.8031 | 0.8862 |
No log | 85.0 | 170 | 0.7220 | 0.7696 | 0.8396 | 0.8031 | 0.8862 |
No log | 86.0 | 172 | 0.7239 | 0.7734 | 0.8396 | 0.8051 | 0.8870 |
No log | 87.0 | 174 | 0.7291 | 0.7772 | 0.8396 | 0.8072 | 0.8847 |
No log | 88.0 | 176 | 0.7344 | 0.7745 | 0.8449 | 0.8082 | 0.8824 |
No log | 89.0 | 178 | 0.7373 | 0.7745 | 0.8449 | 0.8082 | 0.8824 |
No log | 90.0 | 180 | 0.7391 | 0.7707 | 0.8449 | 0.8061 | 0.8832 |
No log | 91.0 | 182 | 0.7403 | 0.7745 | 0.8449 | 0.8082 | 0.8824 |
No log | 92.0 | 184 | 0.7412 | 0.7745 | 0.8449 | 0.8082 | 0.8832 |
No log | 93.0 | 186 | 0.7417 | 0.7707 | 0.8449 | 0.8061 | 0.8832 |
No log | 94.0 | 188 | 0.7402 | 0.7745 | 0.8449 | 0.8082 | 0.8839 |
No log | 95.0 | 190 | 0.7389 | 0.7745 | 0.8449 | 0.8082 | 0.8847 |
No log | 96.0 | 192 | 0.7381 | 0.7696 | 0.8396 | 0.8031 | 0.8839 |
No log | 97.0 | 194 | 0.7377 | 0.7696 | 0.8396 | 0.8031 | 0.8847 |
No log | 98.0 | 196 | 0.7374 | 0.7696 | 0.8396 | 0.8031 | 0.8847 |
No log | 99.0 | 198 | 0.7372 | 0.7696 | 0.8396 | 0.8031 | 0.8847 |
No log | 100.0 | 200 | 0.7372 | 0.7696 | 0.8396 | 0.8031 | 0.8847 |
Framework Versions
- Transformers 4.18.0
- Pytorch 1.11.0+cu113
- Datasets 2.1.0
- Tokenizers 0.12.1
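To reproduce this environment, the listed versions can be pinned directly, e.g. `pip install transformers==4.18.0 datasets==2.1.0 tokenizers==0.12.1` together with a matching CUDA 11.3 build of PyTorch 1.11.0.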
License
This project is licensed under the Apache-2.0 license.
Featured Recommended AI Models

Model | Author | License | Description | Tags | Downloads | Likes |
---|---|---|---|---|---|---|
Indonesian Roberta Base Posp Tagger | w11wo | MIT | POS tagging model fine-tuned from the Indonesian RoBERTa model, trained on the indonlu dataset for Indonesian POS tagging. | Sequence Labeling, Transformers, Other | 2.2M | 7 |
Bert Base NER | dslim | MIT | BERT fine-tuned NER model that identifies four entity types: Location (LOC), Organization (ORG), Person (PER), and Miscellaneous (MISC). | Sequence Labeling, English | 1.8M | 592 |
Deid Roberta I2b2 | obi | MIT | Sequence labeling model fine-tuned on RoBERTa, designed to identify and remove Protected Health Information (PHI/PII) from medical records. | Sequence Labeling, Transformers, Multilingual | 1.1M | 33 |
Ner English Fast | flair | | Flair's built-in fast English 4-class NER model, based on Flair embeddings and an LSTM-CRF architecture; F1 92.92 on CoNLL-03. | Sequence Labeling, PyTorch, English | 978.01k | 24 |
French Camembert Postag Model | gilf | | French POS tagging model based on Camembert-base, trained on the free-french-treebank dataset. | Sequence Labeling, Transformers, French | 950.03k | 9 |
Xlm Roberta Large Ner Spanish | MMG | | Spanish NER model fine-tuned from XLM-Roberta-large, with strong performance on CoNLL-2002. | Sequence Labeling, Transformers, Spanish | 767.35k | 29 |
Nusabert Ner V1.3 | cahya | MIT | NER model fine-tuned from NusaBert-v1.3 for Indonesian NER tasks. | Sequence Labeling, Transformers, Other | 759.09k | 3 |
Ner English Large | flair | | Flair's built-in large English 4-class NER model using document-level XLM-R embeddings and the FLERT technique; F1 94.36 on CoNLL-03. | Sequence Labeling, PyTorch, English | 749.04k | 44 |
Punctuate All | kredor | MIT | Multilingual punctuation prediction model fine-tuned from xlm-roberta-base, supporting automatic punctuation for 12 European languages. | Sequence Labeling, Transformers | 728.70k | 20 |
Xlm Roberta Ner Japanese | tsmatz | MIT | Japanese NER model fine-tuned from xlm-roberta-base. | Sequence Labeling, Transformers, Multilingual | 630.71k | 25 |