# segformer-b0-finetuned-segments-sidewalk-oct-22
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the vimassaru/teethsegmentation dataset, aimed at image segmentation in odontology using dental X-ray images.
## Quick Start
This model, segformer-b0-finetuned-segments-sidewalk-oct-22, is based on the pre-trained [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) checkpoint and was fine-tuned on the vimassaru/teethsegmentation dataset. It can be used for image segmentation tasks in odontology, particularly on X-ray images.
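A minimal inference sketch is shown below. It assumes the checkpoint is published on the Hugging Face Hub and follows the standard SegFormer API; the repo id, the processor class, and the class-index ordering (background first, then FDI tooth numbers 11–48 as in the metrics below) are assumptions, not confirmed by this card.

```python
# Sketch: semantic segmentation of a dental X-ray with this SegFormer checkpoint.
from typing import Dict

# FDI two-digit tooth numbers (quadrants 1-4, positions 1-8) plus a background
# class, matching the per-class metrics reported in this card. The exact index
# order inside the checkpoint's id2label mapping is an assumption.
FDI_TEETH = [f"{q}{p}" for q in range(1, 5) for p in range(1, 9)]
id2label: Dict[int, str] = {0: "background", **{i + 1: t for i, t in enumerate(FDI_TEETH)}}


def load_and_segment(image_path: str):
    """Return a (H, W) map of predicted class ids for one X-ray image.

    Requires torch, transformers, and Pillow; the repo id below is hypothetical.
    """
    import torch
    from PIL import Image
    from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

    repo = "vimassaru/segformer-b0-finetuned-segments-sidewalk-oct-22"  # assumed repo id
    processor = SegformerImageProcessor.from_pretrained(repo)
    model = SegformerForSemanticSegmentation.from_pretrained(repo)

    image = Image.open(image_path).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)
    # SegFormer predicts at 1/4 resolution; upsample before the per-pixel argmax.
    upsampled = torch.nn.functional.interpolate(
        logits, size=image.size[::-1], mode="bilinear", align_corners=False
    )
    return upsampled.argmax(dim=1)[0]
```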
## Evaluation Results
It achieves the following results on the evaluation set:
- Loss: 0.1880
- Mean Iou: 0.7311
- Mean Accuracy: 0.8106
- Overall Accuracy: 0.8184
- Accuracy Background: nan
- Accuracy 11: 0.8456
- Accuracy 12: 0.8351
- Accuracy 13: 0.8619
- Accuracy 14: 0.8112
- Accuracy 15: 0.8087
- Accuracy 16: 0.9141
- Accuracy 17: 0.8742
- Accuracy 18: 0.7394
- Accuracy 21: 0.8758
- Accuracy 22: 0.8579
- Accuracy 23: 0.8480
- Accuracy 24: 0.7169
- Accuracy 25: 0.8273
- Accuracy 26: 0.8481
- Accuracy 27: 0.8284
- Accuracy 28: 0.7298
- Accuracy 31: 0.7495
- Accuracy 32: 0.7987
- Accuracy 33: 0.8661
- Accuracy 34: 0.8392
- Accuracy 35: 0.7596
- Accuracy 36: 0.7482
- Accuracy 37: 0.8109
- Accuracy 38: 0.7016
- Accuracy 41: 0.7217
- Accuracy 42: 0.7480
- Accuracy 43: 0.8447
- Accuracy 44: 0.7868
- Accuracy 45: 0.8250
- Accuracy 46: 0.8762
- Accuracy 47: 0.8519
- Accuracy 48: 0.7878
- Iou Background: 0.0
- Iou 11: 0.8226
- Iou 12: 0.8155
- Iou 13: 0.8048
- Iou 14: 0.7807
- Iou 15: 0.7909
- Iou 16: 0.8609
- Iou 17: 0.8145
- Iou 18: 0.6999
- Iou 21: 0.8266
- Iou 22: 0.8160
- Iou 23: 0.8000
- Iou 24: 0.6900
- Iou 25: 0.7760
- Iou 26: 0.8065
- Iou 27: 0.7338
- Iou 28: 0.6771
- Iou 31: 0.6604
- Iou 32: 0.7394
- Iou 33: 0.7977
- Iou 34: 0.7577
- Iou 35: 0.6944
- Iou 36: 0.6774
- Iou 37: 0.7224
- Iou 38: 0.6099
- Iou 41: 0.6166
- Iou 42: 0.6741
- Iou 43: 0.7706
- Iou 44: 0.7386
- Iou 45: 0.7555
- Iou 46: 0.8271
- Iou 47: 0.8210
- Iou 48: 0.7466
## Technical Details
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0006
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 200
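The linear scheduler decays the learning rate from 6e-4 down to 0 over the run. The training log suggests roughly 10 optimizer steps per epoch (step 20 at epoch 2.0), so about 2000 total steps for 200 epochs. A sketch of that schedule, assuming no warmup (the Hugging Face Trainer default):

```python
def linear_lr(step: int, total_steps: int = 2000, base_lr: float = 6e-4) -> float:
    """Linearly decay the learning rate from base_lr to 0.

    total_steps = 2000 is inferred from the training log (about 10 optimizer
    steps per epoch times 200 epochs); zero warmup is an assumption.
    """
    return base_lr * max(0.0, 1.0 - step / total_steps)

# Full LR at step 0, half the LR at the midpoint, zero at the end of training.
```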
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy 11 | Accuracy 12 | Accuracy 13 | Accuracy 14 | Accuracy 15 | Accuracy 16 | Accuracy 17 | Accuracy 18 | Accuracy 21 | Accuracy 22 | Accuracy 23 | Accuracy 24 | Accuracy 25 | Accuracy 26 | Accuracy 27 | Accuracy 28 | Accuracy 31 | Accuracy 32 | Accuracy 33 | Accuracy 34 | Accuracy 35 | Accuracy 36 | Accuracy 37 | Accuracy 38 | Accuracy 41 | Accuracy 42 | Accuracy 43 | Accuracy 44 | Accuracy 45 | Accuracy 46 | Accuracy 47 | Accuracy 48 | Iou Background | Iou 11 | Iou 12 | Iou 13 | Iou 14 | Iou 15 | Iou 16 | Iou 17 | Iou 18 | Iou 21 | Iou 22 | Iou 23 | Iou 24 | Iou 25 | Iou 26 | Iou 27 | Iou 28 | Iou 31 | Iou 32 | Iou 33 | Iou 34 | Iou 35 | Iou 36 | Iou 37 | Iou 38 | Iou 41 | Iou 42 | Iou 43 | Iou 44 | Iou 45 | Iou 46 | Iou 47 | Iou 48 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.5068 | 2.0 | 20 | 1.5257 | 0.0331 | 0.0872 | 0.1082 | nan | 0.0 | 0.0 | 0.0325 | 0.0 | 0.0 | 0.3861 | 0.7782 | 0.0 | 0.5551 | 0.0 | 0.0003 | 0.0 | 0.0 | 0.0025 | 0.5879 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1233 | 0.1058 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.2200 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0298 | 0.0 | 0.0 | 0.2256 | 0.2666 | 0.0 | 0.0733 | 0.0 | 0.0003 | 0.0 | 0.0 | 0.0025 | 0.1063 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0983 | 0.0956 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.1930 | 0.0 |
| 0.6989 | 4.0 | 40 | 0.6959 | 0.0319 | 0.0406 | 0.0525 | nan | 0.0032 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1235 | 0.5046 | 0.0 | 0.0025 | 0.0 | 0.0023 | 0.0 | 0.0 | 0.0821 | 0.2861 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0411 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2543 | 0.0 | 0.0 | 0.0032 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1124 | 0.3456 | 0.0 | 0.0024 | 0.0 | 0.0023 | 0.0 | 0.0 | 0.0706 | 0.2383 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0395 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2387 | 0.0 |
| 0.5093 | 6.0 | 60 | 0.4954 | 0.1659 | 0.2494 | 0.3064 | nan | 0.4165 | 0.0 | 0.5606 | 0.2388 | 0.2757 | 0.7236 | 0.7757 | 0.0 | 0.6343 | 0.0 | 0.5877 | 0.0009 | 0.0374 | 0.7495 | 0.7771 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0323 | 0.0 | 0.0006 | 0.8019 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0436 | 0.0 | 0.0556 | 0.7505 | 0.5182 | 0.0 | 0.2822 | 0.0 | 0.3631 | 0.2026 | 0.2067 | 0.4694 | 0.5250 | 0.0 | 0.3620 | 0.0 | 0.4125 | 0.0009 | 0.0358 | 0.4261 | 0.5167 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0299 | 0.0 | 0.0006 | 0.5306 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0423 | 0.0 | 0.0555 | 0.5479 | 0.4643 |
| 0.3942 | 8.0 | 80 | 0.3680 | 0.3501 | 0.4658 | 0.5243 | nan | 0.5738 | 0.1433 | 0.7262 | 0.5454 | 0.5830 | 0.7900 | 0.8112 | 0.0005 | 0.7442 | 0.0533 | 0.8308 | 0.3071 | 0.5064 | 0.8316 | 0.7153 | 0.0 | 0.0000 | 0.0743 | 0.6079 | 0.4475 | 0.1634 | 0.6588 | 0.7590 | 0.2711 | 0.0 | 0.3319 | 0.4395 | 0.6430 | 0.2279 | 0.7393 | 0.6828 | 0.6962 | 0.0 | 0.4491 | 0.1389 | 0.5260 | 0.4596 | 0.5023 | 0.6223 | 0.5467 | 0.0005 | 0.5418 | 0.0531 | 0.5015 | 0.2715 | 0.4175 | 0.6065 | 0.4828 | 0.0 | 0.0000 | 0.0713 | 0.3230 | 0.3481 | 0.1586 | 0.5433 | 0.5740 | 0.2533 | 0.0 | 0.2423 | 0.3966 | 0.4280 | 0.2200 | 0.6234 | 0.6177 | 0.6324 |
| 0.2784 | 10.0 | 100 | 0.2889 | 0.4636 | 0.5684 | 0.6056 | nan | 0.7515 | 0.3831 | 0.6687 | 0.4978 | 0.5445 | 0.7171 | 0.8213 | 0.3704 | 0.8125 | 0.4238 | 0.7137 | 0.3553 | 0.3746 | 0.6771 | 0.7124 | 0.3304 | 0.2737 | 0.3675 | 0.6784 | 0.5418 | 0.5543 | 0.6343 | 0.7606 | 0.5879 | 0.0628 | 0.4605 | 0.6789 | 0.6545 | 0.4182 | 0.8261 | 0.8022 | 0.7318 | 0.0 | 0.5755 | 0.3346 | 0.5240 | 0.4448 | 0.5151 | 0.6337 | 0.6315 | 0.3573 | 0.5967 | 0.3743 | 0.5367 | 0.2860 | 0.3354 | 0.6064 | 0.5735 | 0.3130 | 0.2389 | 0.3153 | 0.5370 | 0.4762 | 0.4740 | 0.5568 | 0.6436 | 0.4934 | 0.0622 | 0.3368 | 0.5307 | 0.4908 | 0.4042 | 0.7224 | 0.7127 | 0.6656 |
| 0.2464 | 12.0 | 120 | 0.2468 | 0.5400 | 0.6505 | 0.6768 | nan | 0.7948 | 0.5458 | 0.7406 | 0.5603 | 0.6560 | 0.8367 | 0.8329 | 0.4258 | 0.8497 | 0.5678 | 0.7320 | 0.5367 | 0.6036 | 0.7622 | 0.7365 | 0.5251 | 0.4648 | 0.6275 | 0.7353 | 0.5512 | 0.6709 | 0.6991 | 0.7896 | 0.6117 | 0.2989 | 0.5592 | 0.6312 | 0.5789 | 0.6009 | 0.8589 | 0.7659 | 0.6656 | 0.0 | 0.6477 | 0.4822 | 0.5850 | 0.4886 | 0.6094 | 0.7392 | 0.6728 | 0.4202 | 0.6684 | 0.4945 | 0.6069 | 0.4492 | 0.4973 | 0.6686 | 0.6208 | 0.4915 | 0.3555 | 0.4906 | 0.5972 | 0.4926 | 0.5760 | 0.6233 | 0.6776 | 0.5277 | 0.2650 | 0.4418 | 0.5141 | 0.4982 | 0.5567 | 0.7441 | 0.6964 | 0.6216 |
| 0.1842 | 14.0 | 140 | 0.2142 | 0.6089 | 0.7198 | 0.7422 | nan | 0.7984 | 0.6803 | 0.7803 | 0.6313 | 0.7672 | 0.8662 | 0.8918 | 0.6084 | 0.8619 | 0.7294 | 0.8191 | 0.6076 | 0.7007 | 0.8122 | 0.8322 | 0.3832 | 0.5565 | 0.6651 | 0.8472 | 0.7226 | 0.7371 | 0.7750 | 0.7743 | 0.5362 | 0.5420 | 0.6400 | 0.7419 | 0.7320 | 0.6402 | 0.8401 | 0.8309 | 0.6816 | 0.0 | 0.7143 | 0.6368 | 0.7007 | 0.5825 | 0.6365 | 0.7816 | 0.7521 | 0.5826 | 0.7414 | 0.6622 | 0.7151 | 0.5453 | 0.5588 | 0.7077 | 0.6385 | 0.3652 | 0.4556 | 0.5434 | 0.7138 | 0.6374 | 0.6336 | 0.6464 | 0.6580 | 0.4844 | 0.4542 | 0.5349 | 0.6565 | 0.6279 | 0.5893 | 0.7605 | 0.7330 | 0.6423 |
| 0.1488 | 16.0 | 160 | 0.1933 | 0.6322 | 0.7373 | 0.7542 | nan | 0.7908 | 0.6508 | 0.7861 | 0.7352 | 0.7521 | 0.8892 | 0.8725 | 0.6348 | 0.8581 | 0.6927 | 0.8070 | 0.6642 | 0.7642 | 0.8278 | 0.8363 | 0.5789 | 0.6088 | 0.6556 | 0.8139 | 0.6966 | 0.6829 | 0.7853 | 0.7453 | 0.6678 | 0.5757 | 0.6036 | 0.7537 | 0.7381 | 0.7114 | 0.8446 | 0.8103 | 0.7584 | 0.0 | 0.6990 | 0.6113 | 0.7069 | 0.6612 | 0.7097 | 0.7968 | 0.7383 | 0.6079 | 0.7618 | 0.6629 | 0.7209 | 0.5896 | 0.6152 | 0.7371 | 0.6847 | 0.5395 | 0.5130 | 0.5606 | 0.6936 | 0.6153 | 0.5981 | 0.6347 | 0.6535 | 0.5868 | 0.4842 | 0.5324 | 0.6701 | 0.6410 | 0.6346 | 0.7595 | 0.7362 | 0.7052 |
| 0.1416 | 18.0 | 180 | 0.1841 | 0.6533 | 0.7530 | 0.7611 | nan | 0.8485 | 0.7657 | 0.8163 | 0.6475 | 0.6706 | 0.7889 | 0.8295 | 0.8478 | 0.8320 | 0.7455 | 0.7939 | 0.5882 | 0.6691 | 0.7761 | 0.7446 | 0.7167 | 0.5879 | 0.6785 | 0.7867 | 0.7367 | 0.7456 | 0.7522 | 0.7657 | 0.7979 | 0.6336 | 0.7404 | 0.8204 | 0.7517 | 0.6922 | 0.8420 | 0.8470 | 0.8352 | 0.0 | 0.7802 | 0.7301 | 0.7232 | 0.6005 | 0.6380 | 0.7526 | 0.7347 | 0.6939 | 0.7603 | 0.6879 | 0.6913 | 0.5221 | 0.5773 | 0.7374 | 0.6773 | 0.6223 | 0.5234 | 0.5861 | 0.7062 | 0.6492 | 0.6365 | 0.6644 | 0.6850 | 0.6211 | 0.5522 | 0.6502 | 0.7557 | 0.6860 | 0.6374 | 0.7636 | 0.7728 | 0.7395 |
| 0.1346 | 20.0 | 200 | 0.1742 | 0.6462 | 0.7375 | 0.7528 | nan | 0.8037 | 0.7862 | 0.7855 | 0.7327 | 0.7617 | 0.8752 | 0.8049 | 0.5045 | 0.8113 | 0.7420 | 0.8055 | 0.5818 | 0.6962 | 0.7828 | 0.8136 | 0.5866 | 0.5741 | 0.7100 | 0.8119 | 0.7525 | 0.6603 | 0.7224 | 0.8027 | 0.6645 | 0.6230 | 0.7702 | 0.8144 | 0.6772 | 0.7818 | 0.8084 | 0.8461 | 0.7052 | 0.0 | 0.7651 | 0.7384 | 0.7206 | 0.6790 | 0.7135 | 0.7803 | 0.6876 | 0.4954 | 0.7515 | 0.6922 | 0.7354 | 0.5483 | 0.6050 | 0.7345 | 0.6809 | 0.5512 | 0.5261 | 0.6238 | 0.7292 | 0.6495 | 0.5871 | 0.6396 | 0.7011 | 0.5818 | 0.5478 | 0.6583 | 0.7103 | 0.6303 | 0.6758 | 0.7516 | 0.7592 | 0.6734 |
| 0.1042 | 22.0 | 220 | 0.1637 | 0.6 |
## License
This model is released under the `other` license.