bert_bilstm_crf-ner-weibo
This model is a fine-tuned version of google-bert/bert-base-chinese on an unknown dataset. It provides Chinese Named Entity Recognition (NER) capabilities and reports solid precision, recall, F1, and accuracy on its evaluation set.
🚀 Quick Start
This section provides a high-level overview of the model. For detailed usage, please refer to the official documentation of the transformers library.
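A minimal usage sketch is shown below. It assumes the checkpoint can be loaded through the standard token-classification pipeline; because the model adds a BiLSTM-CRF head on top of BERT, loading it may instead require the custom model code shipped with the repository, and the repo id used here is a hypothetical placeholder.

```python
# Minimal sketch, assuming the checkpoint works with the standard
# token-classification pipeline. The repo id below is hypothetical.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="your-username/bert_bilstm_crf-ner-weibo",  # hypothetical repo id
    aggregation_strategy="simple",  # merge sub-word pieces into entity spans
)

print(ner("我在北京的清华大学读书。"))
```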
✨ Features
- Fine-tuned Model: Based on the pre-trained google-bert/bert-base-chinese model, fine-tuned on a specific dataset to adapt it to NER tasks.
- High Performance: Achieves good results across multiple evaluation metrics, including precision, recall, F1 score, and accuracy.
📚 Documentation
Model description
This model is a fine-tuned version of google-bert/bert-base-chinese on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1945
- Precision: 0.6524
- Recall: 0.7429
- F1: 0.6947
- Accuracy: 0.9703
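The precision, recall, and F1 values above are entity-level scores of the kind typically produced by seqeval during token-classification fine-tuning, while accuracy is token-level. Below is a minimal sketch of that computation; the label sequences are illustrative and not taken from this model's data.

```python
# Entity-level scoring with seqeval, the usual source of precision/recall/F1
# in token-classification training reports. Labels below are illustrative.
from seqeval.metrics import precision_score, recall_score, f1_score, accuracy_score

y_true = [["B-PER", "I-PER", "O", "B-LOC", "I-LOC"]]
y_pred = [["B-PER", "I-PER", "O", "B-LOC", "O"]]

print("precision:", precision_score(y_true, y_pred))
print("recall:   ", recall_score(y_true, y_pred))
print("f1:       ", f1_score(y_true, y_pred))
print("accuracy: ", accuracy_score(y_true, y_pred))  # token-level
```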
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
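As referenced above, here is a minimal sketch of how these hyperparameters might be expressed with transformers.TrainingArguments; model and dataset setup are omitted, and the output directory name is a hypothetical placeholder.

```python
# Sketch only: mirrors the listed hyperparameters, not the author's exact script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert_bilstm_crf-ner-weibo",  # hypothetical output directory
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,  # native AMP mixed-precision training
)
```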
Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.4272 | 1.0 | 22 | 0.3531 | 0.0 | 0.0 | 0.0 | 0.9330 |
| 0.2529 | 2.0 | 44 | 0.1587 | 0.4922 | 0.4884 | 0.4903 | 0.9613 |
| 0.1472 | 3.0 | 66 | 0.1171 | 0.5524 | 0.6915 | 0.6142 | 0.9681 |
| 0.0977 | 4.0 | 88 | 0.1057 | 0.5866 | 0.6967 | 0.6369 | 0.9714 |
| 0.065 | 5.0 | 110 | 0.1035 | 0.6336 | 0.7069 | 0.6683 | 0.9715 |
| 0.0538 | 6.0 | 132 | 0.1149 | 0.6307 | 0.7069 | 0.6667 | 0.9699 |
| 0.0413 | 7.0 | 154 | 0.1057 | 0.6315 | 0.7224 | 0.6739 | 0.9724 |
| 0.0344 | 8.0 | 176 | 0.1236 | 0.5979 | 0.7455 | 0.6636 | 0.9693 |
| 0.0296 | 9.0 | 198 | 0.1271 | 0.5958 | 0.7352 | 0.6582 | 0.9680 |
| 0.0297 | 10.0 | 220 | 0.1257 | 0.6442 | 0.6889 | 0.6658 | 0.9702 |
| 0.0212 | 11.0 | 242 | 0.1440 | 0.6037 | 0.7481 | 0.6682 | 0.9664 |
| 0.0208 | 12.0 | 264 | 0.1368 | 0.6284 | 0.7044 | 0.6642 | 0.9683 |
| 0.0165 | 13.0 | 286 | 0.1337 | 0.6545 | 0.7404 | 0.6948 | 0.9698 |
| 0.0164 | 14.0 | 308 | 0.1388 | 0.6514 | 0.7301 | 0.6885 | 0.9700 |
| 0.014 | 15.0 | 330 | 0.1403 | 0.6690 | 0.7275 | 0.6970 | 0.9701 |
| 0.0109 | 16.0 | 352 | 0.1467 | 0.6448 | 0.7326 | 0.6859 | 0.9694 |
| 0.0108 | 17.0 | 374 | 0.1488 | 0.6081 | 0.7301 | 0.6636 | 0.9670 |
| 0.0106 | 18.0 | 396 | 0.1564 | 0.6572 | 0.7147 | 0.6847 | 0.9687 |
| 0.0105 | 19.0 | 418 | 0.1620 | 0.6667 | 0.7147 | 0.6898 | 0.9691 |
| 0.01 | 20.0 | 440 | 0.1638 | 0.7046 | 0.6684 | 0.6860 | 0.9705 |
| 0.0106 | 21.0 | 462 | 0.1542 | 0.6709 | 0.6761 | 0.6735 | 0.9692 |
| 0.0092 | 22.0 | 484 | 0.1487 | 0.6683 | 0.7198 | 0.6931 | 0.9694 |
| 0.011 | 23.0 | 506 | 0.1502 | 0.6396 | 0.7301 | 0.6819 | 0.9691 |
| 0.0068 | 24.0 | 528 | 0.1534 | 0.6801 | 0.7378 | 0.7078 | 0.9705 |
| 0.0077 | 25.0 | 550 | 0.1600 | 0.6793 | 0.7352 | 0.7062 | 0.9710 |
| 0.0071 | 26.0 | 572 | 0.1644 | 0.6386 | 0.7404 | 0.6857 | 0.9676 |
| 0.0062 | 27.0 | 594 | 0.1714 | 0.6430 | 0.7224 | 0.6804 | 0.9688 |
| 0.006 | 28.0 | 616 | 0.1649 | 0.6461 | 0.7275 | 0.6844 | 0.9694 |
| 0.0072 | 29.0 | 638 | 0.1631 | 0.6643 | 0.7326 | 0.6968 | 0.9695 |
| 0.0122 | 30.0 | 660 | 0.1802 | 0.6054 | 0.7455 | 0.6682 | 0.9676 |
| 0.0062 | 31.0 | 682 | 0.1829 | 0.6154 | 0.7404 | 0.6721 | 0.9676 |
| 0.0075 | 32.0 | 704 | 0.1674 | 0.6313 | 0.7352 | 0.6793 | 0.9691 |
| 0.0048 | 33.0 | 726 | 0.1664 | 0.6422 | 0.7429 | 0.6889 | 0.9692 |
| 0.0045 | 34.0 | 748 | 0.1724 | 0.6374 | 0.7455 | 0.6872 | 0.9697 |
| 0.0055 | 35.0 | 770 | 0.1714 | 0.6636 | 0.7301 | 0.6952 | 0.9700 |
| 0.0071 | 36.0 | 792 | 0.1673 | 0.6316 | 0.7404 | 0.6817 | 0.9692 |
| 0.0039 | 37.0 | 814 | 0.1635 | 0.6620 | 0.7352 | 0.6967 | 0.9709 |
| 0.0036 | 38.0 | 836 | 0.1727 | 0.6584 | 0.7532 | 0.7026 | 0.9710 |
| 0.0051 | 39.0 | 858 | 0.1735 | 0.6509 | 0.7429 | 0.6939 | 0.9708 |
| 0.0033 | 40.0 | 880 | 0.1758 | 0.6949 | 0.7378 | 0.7157 | 0.9718 |
| 0.0045 | 41.0 | 902 | 0.1812 | 0.6309 | 0.7558 | 0.6877 | 0.9698 |
| 0.0035 | 42.0 | 924 | 0.1791 | 0.6729 | 0.7404 | 0.7050 | 0.9709 |
| 0.0043 | 43.0 | 946 | 0.1923 | 0.6532 | 0.7455 | 0.6963 | 0.9697 |
| 0.0045 | 44.0 | 968 | 0.1815 | 0.6492 | 0.7326 | 0.6884 | 0.9696 |
| 0.0037 | 45.0 | 990 | 0.1830 | 0.6493 | 0.7378 | 0.6907 | 0.9700 |
| 0.0045 | 46.0 | 1012 | 0.1809 | 0.6493 | 0.7378 | 0.6907 | 0.9700 |
| 0.0039 | 47.0 | 1034 | 0.1811 | 0.6545 | 0.7404 | 0.6948 | 0.9701 |
| 0.0046 | 48.0 | 1056 | 0.1740 | 0.6659 | 0.7172 | 0.6906 | 0.9708 |
| 0.0039 | 49.0 | 1078 | 0.1827 | 0.6318 | 0.7455 | 0.6840 | 0.9694 |
| 0.0036 | 50.0 | 1100 | 0.1762 | 0.6443 | 0.7404 | 0.6890 | 0.9698 |
| 0.0046 | 51.0 | 1122 | 0.1752 | 0.6538 | 0.7378 | 0.6932 | 0.9702 |
| 0.0036 | 52.0 | 1144 | 0.1856 | 0.6344 | 0.7404 | 0.6833 | 0.9692 |
| 0.0036 | 53.0 | 1166 | 0.1870 | 0.6350 | 0.7378 | 0.6825 | 0.9693 |
| 0.0049 | 54.0 | 1188 | 0.1840 | 0.6723 | 0.7121 | 0.6916 | 0.9699 |
| 0.0042 | 55.0 | 1210 | 0.1927 | 0.6220 | 0.7404 | 0.6761 | 0.9687 |
| 0.0039 | 56.0 | 1232 | 0.1854 | 0.6545 | 0.7352 | 0.6925 | 0.9704 |
| 0.0042 | 57.0 | 1254 | 0.1900 | 0.6523 | 0.7378 | 0.6924 | 0.9700 |
| 0.0028 | 58.0 | 1276 | 0.1894 | 0.6486 | 0.7404 | 0.6915 | 0.9697 |
| 0.0049 | 59.0 | 1298 | 0.1904 | 0.6366 | 0.7429 | 0.6856 | 0.9695 |
| 0.0031 | 60.0 | 1320 | 0.1844 | 0.6492 | 0.7326 | 0.6884 | 0.9698 |
| 0.0045 | 61.0 | 1342 | 0.1866 | 0.6429 | 0.7404 | 0.6882 | 0.9696 |
| 0.004 | 62.0 | 1364 | 0.1888 | 0.625 | 0.7326 | 0.6746 | 0.9686 |
| 0.0031 | 63.0 | 1386 | 0.1922 | 0.6875 | 0.7352 | 0.7106 | 0.9710 |
| 0.0044 | 64.0 | 1408 | 0.1918 | 0.6722 | 0.7326 | 0.7011 | 0.9706 |
| 0.0046 | 65.0 | 1430 | 0.1987 | 0.6475 | 0.7506 | 0.6952 | 0.9685 |
| 0.0044 | 66.0 | 1452 | 0.1868 | 0.6388 | 0.7455 | 0.6880 | 0.9698 |
| 0.0042 | 67.0 | 1474 | 0.1920 | 0.6356 | 0.7532 | 0.6894 | 0.9695 |
| 0.0038 | 68.0 | 1496 | 0.1852 | 0.6606 | 0.7506 | 0.7028 | 0.9705 |
| 0.0033 | 69.0 | 1518 | 0.1843 | 0.6476 | 0.7558 | 0.6975 | 0.9700 |
| 0.0034 | 70.0 | 1540 | 0.1797 | 0.6532 | 0.7506 | 0.6986 | 0.9707 |
| 0.0042 | 71.0 | 1562 | 0.1820 | 0.6332 | 0.7455 | 0.6848 | 0.9699 |
| 0.0033 | 72.0 | 1584 | 0.1874 | 0.6482 | 0.7532 | 0.6968 | 0.9704 |
| 0.0039 | 73.0 | 1606 | 0.1878 | 0.6636 | 0.7506 | 0.7045 | 0.9708 |
| 0.003 | 74.0 | 1628 | 0.1857 | 0.6553 | 0.7429 | 0.6964 | 0.9712 |
| 0.0038 | 75.0 | 1650 | 0.1889 | 0.6606 | 0.7404 | 0.6982 | 0.9709 |
| 0.004 | 76.0 | 1672 | 0.1880 | 0.6539 | 0.7481 | 0.6978 | 0.9709 |
| 0.0032 | 77.0 | 1694 | 0.1875 | 0.6590 | 0.7404 | 0.6973 | 0.9706 |
| 0.0034 | 78.0 | 1716 | 0.1868 | 0.6532 | 0.7455 | 0.6963 | 0.9710 |
| 0.0029 | 79.0 | 1738 | 0.1899 | 0.6545 | 0.7404 | 0.6948 | 0.9705 |
| 0.0032 | 80.0 | 1760 | 0.1899 | 0.6628 | 0.7429 | 0.7006 | 0.9709 |
| 0.0037 | 81.0 | 1782 | 0.1928 | 0.6545 | 0.7404 | 0.6948 | 0.9705 |
| 0.0039 | 82.0 | 1804 | 0.1916 | 0.6560 | 0.7404 | 0.6957 | 0.9705 |
| 0.0034 | 83.0 | 1826 | 0.1926 | 0.6560 | 0.7352 | 0.6933 | 0.9705 |
| 0.0032 | 84.0 | 1848 | 0.1931 | 0.6621 | 0.7455 | 0.7013 | 0.9709 |
| 0.0048 | 85.0 | 1870 | 0.1925 | 0.6659 | 0.7481 | 0.7046 | 0.9712 |
| 0.0039 | 86.0 | 1892 | 0.1903 | 0.6690 | 0.7326 | 0.6994 | 0.9709 |
| 0.0039 | 87.0 | 1914 | 0.1948 | 0.6538 | 0.7429 | 0.6955 | 0.9709 |
| 0.0032 | 88.0 | 1936 | 0.1949 | 0.6682 | 0.7558 | 0.7093 | 0.9710 |
| 0.003 | 89.0 | 1958 | 0.1948 | 0.6697 | 0.7609 | 0.7124 | 0.9710 |
| 0.0027 | 90.0 | 1980 | 0.1927 | 0.6489 | 0.7506 | 0.6961 | 0.9705 |
| 0.0029 | 91.0 | 2002 | 0.1931 | 0.6496 | 0.7481 | 0.6953 | 0.9706 |
| 0.003 | 92.0 | 2024 | 0.1932 | 0.6532 | 0.7455 | 0.6963 | 0.9712 |
| 0.0029 | 93.0 | 2046 | 0.1928 | 0.6539 | 0.7481 | 0.6978 | 0.9712 |
| 0.0036 | 94.0 | 2068 | 0.1935 | 0.6503 | 0.7506 | 0.6969 | 0.9710 |
| 0.0034 | 95.0 | 2090 | 0.1941 | 0.6607 | 0.7558 | 0.7050 | 0.9714 |
| 0.0035 | 96.0 | 2112 | 0.1940 | 0.6621 | 0.7455 | 0.7013 | 0.9711 |
| 0.0028 | 97.0 | 2134 | 0.1940 | 0.6553 | 0.7429 | 0.6964 | 0.9707 |
| 0.0032 | 98.0 | 2156 | 0.1944 | 0.6509 | 0.7429 | 0.6939 | 0.9704 |
| 0.0028 | 99.0 | 2178 | 0.1943 | 0.6509 | 0.7429 | 0.6939 | 0.9705 |
| 0.0021 | 100.0 | 2200 | 0.1945 | 0.6524 | 0.7429 | 0.6947 | 0.9703 |
Framework versions
- Transformers 4.46.1
- Pytorch 1.13.1+cu117
- Datasets 3.1.0
- Tokenizers 0.20.2
🔧 Technical Details
Model Information
| Property | Details |
|:---------|:--------|
| Model Type | Fine-tuned BERT-BiLSTM-CRF for NER |
| Training Data | Unknown |
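For reference, below is a hedged sketch of what a BERT-BiLSTM-CRF tagging architecture typically looks like, assuming the pytorch-crf package for the CRF layer; it illustrates the model family named above and is not the author's exact implementation.

```python
# Sketch of a typical BERT + BiLSTM + CRF tagger; hyperparameters such as the
# LSTM hidden size are illustrative assumptions, not values from this model.
import torch.nn as nn
from torchcrf import CRF  # from the pytorch-crf package
from transformers import AutoModel

class BertBiLstmCrf(nn.Module):
    def __init__(self, num_labels: int, lstm_hidden: int = 256):
        super().__init__()
        self.bert = AutoModel.from_pretrained("google-bert/bert-base-chinese")
        self.lstm = nn.LSTM(
            self.bert.config.hidden_size,
            lstm_hidden,
            batch_first=True,
            bidirectional=True,
        )
        self.classifier = nn.Linear(2 * lstm_hidden, num_labels)
        self.crf = CRF(num_labels, batch_first=True)

    def forward(self, input_ids, attention_mask, labels=None):
        hidden = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        hidden, _ = self.lstm(hidden)
        emissions = self.classifier(hidden)
        mask = attention_mask.bool()
        if labels is not None:
            # The CRF returns a log-likelihood; negate it to obtain a loss.
            return -self.crf(emissions, labels, mask=mask, reduction="mean")
        # Viterbi decoding returns the most likely tag sequence per example.
        return self.crf.decode(emissions, mask=mask)
```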