# swin-tiny-patch4-window7-224-finetuned-woody_LeftGR_130epochs
This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 0.3377
- Accuracy: 0.9047
## Quick Start
This model is a fine-tuned image classification model. It can be used for various image-related tasks after proper adaptation.
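A minimal usage sketch with the `transformers` `pipeline` API. The repo id in `MODEL_ID` and the helper `top_label` are placeholders added here for illustration; substitute the actual checkpoint path or Hub id:

```python
# Hypothetical checkpoint location -- replace with the real path or Hub repo id.
MODEL_ID = "swin-tiny-patch4-window7-224-finetuned-woody_LeftGR_130epochs"

def load_classifier():
    """Build an image-classification pipeline from the fine-tuned checkpoint."""
    from transformers import pipeline  # imported lazily; requires transformers + torch
    return pipeline("image-classification", model=MODEL_ID)

def top_label(predictions):
    """Pick the highest-scoring label from pipeline output,
    which is a list of {"label": ..., "score": ...} dicts."""
    return max(predictions, key=lambda p: p["score"])["label"]

# Example (requires the checkpoint to be available locally or on the Hub):
# preds = load_classifier()("leaf.jpg")  # path to your input image
# print(top_label(preds))
```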
## Documentation
### Model description
More information needed
### Intended uses & limitations
More information needed
### Training and evaluation data
More information needed
### Training procedure
#### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 130
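A small sketch of how the derived values above follow from the base hyperparameters. The per-epoch step count of 61 is taken from the training log, and the warmup step count is an illustration of how `lr_scheduler_warmup_ratio` translates to steps:

```python
# Base hyperparameters, as listed above.
train_batch_size = 32
gradient_accumulation_steps = 4
num_epochs = 130
steps_per_epoch = 61  # optimizer steps per epoch, as logged in the training results
warmup_ratio = 0.1

# Gradients are accumulated over 4 micro-batches of 32 images each.
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 128

# Total optimizer steps and the linear-warmup portion of the schedule.
total_steps = steps_per_epoch * num_epochs      # 7930
warmup_steps = int(total_steps * warmup_ratio)  # 793

print(total_train_batch_size, total_steps, warmup_steps)  # 128 7930 793
```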
#### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6614 | 1.0 | 61 | 0.6404 | 0.6521 |
| 0.5982 | 2.0 | 122 | 0.5548 | 0.7107 |
| 0.579 | 3.0 | 183 | 0.5390 | 0.7141 |
| 0.5621 | 4.0 | 244 | 0.4920 | 0.7623 |
| 0.5567 | 5.0 | 305 | 0.5375 | 0.7313 |
| 0.5271 | 6.0 | 366 | 0.5542 | 0.7405 |
| 0.5312 | 7.0 | 427 | 0.4573 | 0.7876 |
| 0.5477 | 8.0 | 488 | 0.4540 | 0.7784 |
| 0.5554 | 9.0 | 549 | 0.4932 | 0.7635 |
| 0.5247 | 10.0 | 610 | 0.4407 | 0.7968 |
| 0.5239 | 11.0 | 671 | 0.4479 | 0.7842 |
| 0.5294 | 12.0 | 732 | 0.4509 | 0.7910 |
| 0.531 | 13.0 | 793 | 0.4419 | 0.7933 |
| 0.5493 | 14.0 | 854 | 0.4646 | 0.7784 |
| 0.4934 | 15.0 | 915 | 0.4310 | 0.7968 |
| 0.4965 | 16.0 | 976 | 0.4449 | 0.7876 |
| 0.4946 | 17.0 | 1037 | 0.4342 | 0.8129 |
| 0.4716 | 18.0 | 1098 | 0.4129 | 0.8140 |
| 0.4679 | 19.0 | 1159 | 0.4290 | 0.8002 |
| 0.4799 | 20.0 | 1220 | 0.4356 | 0.7842 |
| 0.4744 | 21.0 | 1281 | 0.4042 | 0.8094 |
| 0.4512 | 22.0 | 1342 | 0.3953 | 0.8117 |
| 0.4633 | 23.0 | 1403 | 0.4157 | 0.7956 |
| 0.4528 | 24.0 | 1464 | 0.3920 | 0.8094 |
| 0.4427 | 25.0 | 1525 | 0.3930 | 0.8220 |
| 0.4238 | 26.0 | 1586 | 0.3891 | 0.8140 |
| 0.4257 | 27.0 | 1647 | 0.3700 | 0.8255 |
| 0.4102 | 28.0 | 1708 | 0.4122 | 0.7968 |
| 0.4505 | 29.0 | 1769 | 0.4210 | 0.7945 |
| 0.3973 | 30.0 | 1830 | 0.3923 | 0.8197 |
| 0.3824 | 31.0 | 1891 | 0.3908 | 0.8473 |
| 0.3887 | 32.0 | 1952 | 0.3897 | 0.8312 |
| 0.3723 | 33.0 | 2013 | 0.3747 | 0.8381 |
| 0.3608 | 34.0 | 2074 | 0.3706 | 0.8301 |
| 0.3718 | 35.0 | 2135 | 0.3937 | 0.8255 |
| 0.3692 | 36.0 | 2196 | 0.3984 | 0.8037 |
| 0.3533 | 37.0 | 2257 | 0.3792 | 0.8335 |
| 0.3625 | 38.0 | 2318 | 0.4070 | 0.8163 |
| 0.3633 | 39.0 | 2379 | 0.4130 | 0.8232 |
| 0.3602 | 40.0 | 2440 | 0.3996 | 0.8186 |
| 0.3557 | 41.0 | 2501 | 0.3756 | 0.8335 |
| 0.3373 | 42.0 | 2562 | 0.3914 | 0.8220 |
| 0.3102 | 43.0 | 2623 | 0.4165 | 0.8507 |
| 0.3135 | 44.0 | 2684 | 0.3852 | 0.8278 |
| 0.3286 | 45.0 | 2745 | 0.4164 | 0.8450 |
| 0.316 | 46.0 | 2806 | 0.3498 | 0.8496 |
| 0.2802 | 47.0 | 2867 | 0.3887 | 0.8462 |
| 0.3184 | 48.0 | 2928 | 0.3829 | 0.8576 |
| 0.2785 | 49.0 | 2989 | 0.3627 | 0.8485 |
| 0.2988 | 50.0 | 3050 | 0.3679 | 0.8370 |
| 0.267 | 51.0 | 3111 | 0.3528 | 0.8645 |
| 0.2907 | 52.0 | 3172 | 0.3538 | 0.8519 |
| 0.2857 | 53.0 | 3233 | 0.3593 | 0.8530 |
| 0.2651 | 54.0 | 3294 | 0.3732 | 0.8439 |
| 0.2447 | 55.0 | 3355 | 0.3441 | 0.8542 |
| 0.2542 | 56.0 | 3416 | 0.3897 | 0.8576 |
| 0.2634 | 57.0 | 3477 | 0.4082 | 0.8657 |
| 0.2505 | 58.0 | 3538 | 0.3416 | 0.8657 |
| 0.2555 | 59.0 | 3599 | 0.3725 | 0.8576 |
| 0.2466 | 60.0 | 3660 | 0.3496 | 0.8680 |
| 0.2585 | 61.0 | 3721 | 0.3214 | 0.8783 |
| 0.235 | 62.0 | 3782 | 0.3584 | 0.8737 |
| 0.215 | 63.0 | 3843 | 0.3467 | 0.8657 |
| 0.236 | 64.0 | 3904 | 0.3471 | 0.8829 |
| 0.2211 | 65.0 | 3965 | 0.3318 | 0.8863 |
| 0.1989 | 66.0 | 4026 | 0.3645 | 0.8852 |
| 0.2133 | 67.0 | 4087 | 0.3456 | 0.8898 |
| 0.2169 | 68.0 | 4148 | 0.3287 | 0.8852 |
| 0.223 | 69.0 | 4209 | 0.3182 | 0.8921 |
| 0.2379 | 70.0 | 4270 | 0.3260 | 0.8840 |
| 0.2149 | 71.0 | 4331 | 0.3230 | 0.8886 |
| 0.2007 | 72.0 | 4392 | 0.3926 | 0.8760 |
| 0.2091 | 73.0 | 4453 | 0.4133 | 0.8783 |
| 0.2229 | 74.0 | 4514 | 0.3867 | 0.8772 |
| 0.1903 | 75.0 | 4575 | 0.3594 | 0.8840 |
| 0.2124 | 76.0 | 4636 | 0.3388 | 0.8875 |
| 0.1999 | 77.0 | 4697 | 0.3305 | 0.8875 |
| 0.2053 | 78.0 | 4758 | 0.4670 | 0.8840 |
| 0.1958 | 79.0 | 4819 | 0.3468 | 0.8909 |
| 0.1839 | 80.0 | 4880 | 0.3902 | 0.8886 |
| 0.1715 | 81.0 | 4941 | 0.3830 | 0.8875 |
| 0.1803 | 82.0 | 5002 | 0.3134 | 0.8967 |
| 0.1803 | 83.0 | 5063 | 0.3935 | 0.8909 |
| 0.1865 | 84.0 | 5124 | 0.3882 | 0.8863 |
| 0.1884 | 85.0 | 5185 | 0.3485 | 0.8990 |
| 0.1663 | 86.0 | 5246 | 0.3667 | 0.8944 |
| 0.1665 | 87.0 | 5307 | 0.3545 | 0.8932 |
| 0.1556 | 88.0 | 5368 | 0.3882 | 0.8944 |
| 0.18 | 89.0 | 5429 | 0.3751 | 0.8898 |
| 0.1974 | 90.0 | 5490 | 0.3979 | 0.8863 |
| 0.1622 | 91.0 | 5551 | 0.3623 | 0.8967 |
| 0.1657 | 92.0 | 5612 | 0.3855 | 0.8978 |
| 0.1672 | 93.0 | 5673 | 0.3722 | 0.8944 |
| 0.1807 | 94.0 | 5734 | 0.3994 | 0.8932 |
| 0.1419 | 95.0 | 5795 | 0.4017 | 0.8863 |
| 0.178 | 96.0 | 5856 | 0.4168 | 0.8886 |
| 0.1402 | 97.0 | 5917 | 0.3727 | 0.8944 |
| 0.1427 | 98.0 | 5978 | 0.3919 | 0.8967 |
| 0.1318 | 99.0 | 6039 | 0.3843 | 0.8955 |
| 0.1417 | 100.0 | 6100 | 0.4017 | 0.8898 |
| 0.1536 | 101.0 | 6161 | 0.3613 | 0.8955 |
| 0.1631 | 102.0 | 6222 | 0.3377 | 0.9047 |
| 0.1459 | 103.0 | 6283 | 0.3724 | 0.8967 |
| 0.1499 | 104.0 | 6344 | 0.3934 | 0.8955 |
| 0.1572 | 105.0 | 6405 | 0.3368 | 0.8967 |
| 0.1308 | 106.0 | 6466 | 0.3782 | 0.8990 |
| 0.1535 | 107.0 | 6527 | 0.3306 | 0.9024 |
| 0.125 | 108.0 | 6588 | 0.4076 | 0.8898 |
| 0.1339 | 109.0 | 6649 | 0.3628 | 0.8990 |
| 0.148 | 110.0 | 6710 | 0.3672 | 0.9013 |
| 0.1725 | 111.0 | 6771 | 0.4006 | 0.8909 |
| 0.1326 | 112.0 | 6832 | 0.4117 | 0.8921 |
| 0.1438 | 113.0 | 6893 | 0.3927 | 0.8978 |
| 0.1205 | 114.0 | 6954 | 0.3612 | 0.8990 |
| 0.1531 | 115.0 | 7015 | 0.3594 | 0.8932 |
| 0.1473 | 116.0 | 7076 | 0.4490 | 0.8875 |
| 0.1388 | 117.0 | 7137 | 0.3952 | 0.8921 |
| 0.136 | 118.0 | 7198 | 0.4098 | 0.8921 |
| 0.1579 | 119.0 | 7259 | 0.3595 | 0.9013 |
| 0.1359 | 120.0 | 7320 | 0.3970 | 0.8944 |
| 0.1314 | 121.0 | 7381 | 0.4092 | 0.8932 |
| 0.1337 | 122.0 | 7442 | 0.4192 | 0.8909 |
| 0.1538 | 123.0 | 7503 | 0.4154 | 0.8898 |
| 0.119 | 124.0 | 7564 | 0.4120 | 0.8909 |
| 0.1353 | 125.0 | 7625 | 0.4060 | 0.8921 |
| 0.1489 | 126.0 | 7686 | 0.4162 | 0.8909 |
| 0.1554 | 127.0 | 7747 | 0.4148 | 0.8944 |
| 0.1558 | 128.0 | 7808 | 0.4169 | 0.8944 |
| 0.1268 | 129.0 | 7869 | 0.4110 | 0.8955 |
| 0.1236 | 130.0 | 7930 | 0.4197 | 0.8944 |
#### Framework versions
- Transformers 4.23.1
- Pytorch 1.12.1+cu113
- Datasets 2.6.1
- Tokenizers 0.13.1
## License
This model is licensed under the Apache-2.0 license.