# hf_train_output
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the rock-glacier-dataset. It provides high-accuracy image classification, reaching an accuracy of 0.9258 on the evaluation set.
## 🚀 Quick Start
This model is already fine-tuned on the rock-glacier-dataset, so you can use it directly for image classification, as in the sketch below.
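A minimal usage sketch with the 🤗 Transformers `pipeline` API. The repository id `your-username/hf_train_output` and the image filename are placeholders, since the card does not state where the checkpoint is hosted:

```python
from transformers import pipeline

# Placeholder hub id -- substitute the actual repository of this checkpoint.
classifier = pipeline("image-classification", model="your-username/hf_train_output")

# Any local image path, URL, or PIL image works; the filename here is illustrative.
predictions = classifier("glacier_example.jpg")
for pred in predictions:
    print(f"{pred['label']}: {pred['score']:.4f}")
```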
## ✨ Features
- High Accuracy: Achieves an accuracy of 0.9258 on the evaluation set.
- Fine-Tuned: Based on the pre-trained [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) checkpoint, fine-tuned on the rock-glacier-dataset.
## 📚 Documentation
### Model Performance
This model achieves the following results on the evaluation set (a sketch of the metric computation follows the list):
- Loss: 0.3894
- Accuracy: 0.9258
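The card does not include the evaluation code; the following is a hedged sketch of a typical `compute_metrics` function that produces such an accuracy figure when passed to the 🤗 `Trainer`:

```python
import numpy as np
import evaluate  # assumed available; not listed among the framework versions below

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # eval_pred is a (logits, labels) pair; argmax over classes gives predictions.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```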
### Model Index

| Property | Details |
| --- | --- |
| Model Name | hf_train_output |
| Task | Image Classification |
| Dataset | rock-glacier-dataset |
| Metrics | Accuracy: 0.9258241758241759 |
### Training and Evaluation Data
The model was trained and evaluated on the rock-glacier-dataset; a loading sketch follows.
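A hedged sketch of loading the data with 🤗 Datasets; the hub id is an assumption, since the card names the dataset only as rock-glacier-dataset and it may require an owner prefix:

```python
from datasets import load_dataset

# Hypothetical hub id -- the card only names "rock-glacier-dataset".
dataset = load_dataset("rock-glacier-dataset")
print(dataset)  # inspect the available splits and label names
```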
### Training Procedure
#### Training Hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
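As referenced above, here is a minimal sketch mapping these values onto 🤗 Transformers `TrainingArguments` (consistent with the 4.24.0 API listed under Framework Versions). `output_dir` and the 50-step evaluation/logging cadence are assumptions, the latter inferred from the results table:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hf_train_output",  # assumed; the card does not name a directory
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    fp16=True,  # Native AMP mixed-precision training
    evaluation_strategy="steps",  # assumed; eval every 50 steps per the table below
    eval_steps=50,
    logging_steps=50,
)
```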
#### Training Results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
| --- | --- | --- | --- | --- |
| 0.5619 | 0.55 | 50 | 0.5432 | 0.7692 |
| 0.4582 | 1.1 | 100 | 0.4435 | 0.8352 |
| 0.3548 | 1.65 | 150 | 0.3739 | 0.8599 |
| 0.217 | 2.2 | 200 | 0.2913 | 0.9093 |
| 0.1709 | 2.75 | 250 | 0.2619 | 0.9148 |
| 0.0919 | 3.3 | 300 | 0.2475 | 0.9148 |
| 0.0652 | 3.85 | 350 | 0.3275 | 0.8901 |
| 0.0495 | 4.4 | 400 | 0.2515 | 0.9093 |
| 0.0321 | 4.95 | 450 | 0.2878 | 0.9066 |
| 0.0247 | 5.49 | 500 | 0.2612 | 0.9148 |
| 0.017 | 6.04 | 550 | 0.2687 | 0.9176 |
| 0.0131 | 6.59 | 600 | 0.3062 | 0.9093 |
| 0.0113 | 7.14 | 650 | 0.2587 | 0.9231 |
| 0.0099 | 7.69 | 700 | 0.2815 | 0.9203 |
| 0.009 | 8.24 | 750 | 0.2675 | 0.9286 |
| 0.0084 | 8.79 | 800 | 0.2711 | 0.9286 |
| 0.0077 | 9.34 | 850 | 0.2663 | 0.9313 |
| 0.0073 | 9.89 | 900 | 0.3003 | 0.9258 |
| 0.0069 | 10.44 | 950 | 0.2758 | 0.9313 |
| 0.0064 | 10.99 | 1000 | 0.2999 | 0.9258 |
| 0.0061 | 11.54 | 1050 | 0.2931 | 0.9313 |
| 0.0057 | 12.09 | 1100 | 0.2989 | 0.9313 |
| 0.0056 | 12.64 | 1150 | 0.2974 | 0.9313 |
| 0.0053 | 13.19 | 1200 | 0.3099 | 0.9258 |
| 0.005 | 13.74 | 1250 | 0.3131 | 0.9313 |
| 0.0049 | 14.29 | 1300 | 0.3201 | 0.9258 |
| 0.0046 | 14.84 | 1350 | 0.3109 | 0.9313 |
| 0.0045 | 15.38 | 1400 | 0.3168 | 0.9313 |
| 0.0043 | 15.93 | 1450 | 0.3226 | 0.9231 |
| 0.0042 | 16.48 | 1500 | 0.3234 | 0.9231 |
| 0.0041 | 17.03 | 1550 | 0.3283 | 0.9258 |
| 0.0039 | 17.58 | 1600 | 0.3304 | 0.9258 |
| 0.0038 | 18.13 | 1650 | 0.3321 | 0.9231 |
| 0.0037 | 18.68 | 1700 | 0.3362 | 0.9231 |
| 0.0036 | 19.23 | 1750 | 0.3307 | 0.9286 |
| 0.0035 | 19.78 | 1800 | 0.3357 | 0.9231 |
| 0.0034 | 20.33 | 1850 | 0.3244 | 0.9313 |
| 0.0033 | 20.88 | 1900 | 0.3497 | 0.9231 |
| 0.0032 | 21.43 | 1950 | 0.3443 | 0.9231 |
| 0.0031 | 21.98 | 2000 | 0.3398 | 0.9286 |
| 0.003 | 22.53 | 2050 | 0.3388 | 0.9286 |
| 0.003 | 23.08 | 2100 | 0.3399 | 0.9286 |
| 0.0029 | 23.63 | 2150 | 0.3548 | 0.9231 |
| 0.0028 | 24.18 | 2200 | 0.3475 | 0.9286 |
| 0.0028 | 24.73 | 2250 | 0.3480 | 0.9286 |
| 0.0027 | 25.27 | 2300 | 0.3542 | 0.9231 |
| 0.0026 | 25.82 | 2350 | 0.3589 | 0.9231 |
| 0.0026 | 26.37 | 2400 | 0.3449 | 0.9286 |
| 0.0025 | 26.92 | 2450 | 0.3604 | 0.9231 |
| 0.0025 | 27.47 | 2500 | 0.3493 | 0.9286 |
| 0.0024 | 28.02 | 2550 | 0.3631 | 0.9258 |
| 0.0024 | 28.57 | 2600 | 0.3590 | 0.9258 |
| 0.0023 | 29.12 | 2650 | 0.3604 | 0.9258 |
| 0.0023 | 29.67 | 2700 | 0.3667 | 0.9258 |
| 0.0022 | 30.22 | 2750 | 0.3571 | 0.9286 |
| 0.0022 | 30.77 | 2800 | 0.3660 | 0.9258 |
| 0.0021 | 31.32 | 2850 | 0.3638 | 0.9286 |
| 0.0021 | 31.87 | 2900 | 0.3729 | 0.9258 |
| 0.0021 | 32.42 | 2950 | 0.3706 | 0.9258 |
| 0.002 | 32.97 | 3000 | 0.3669 | 0.9286 |
| 0.002 | 33.52 | 3050 | 0.3740 | 0.9258 |
| 0.002 | 34.07 | 3100 | 0.3693 | 0.9286 |
| 0.002 | 34.62 | 3150 | 0.3700 | 0.9286 |
| 0.0019 | 35.16 | 3200 | 0.3752 | 0.9258 |
| 0.0019 | 35.71 | 3250 | 0.3753 | 0.9258 |
| 0.0019 | 36.26 | 3300 | 0.3721 | 0.9286 |
| 0.0018 | 36.81 | 3350 | 0.3764 | 0.9258 |
| 0.0018 | 37.36 | 3400 | 0.3758 | 0.9258 |
| 0.0018 | 37.91 | 3450 | 0.3775 | 0.9258 |
| 0.0018 | 38.46 | 3500 | 0.3812 | 0.9258 |
| 0.0018 | 39.01 | 3550 | 0.3817 | 0.9258 |
| 0.0017 | 39.56 | 3600 | 0.3815 | 0.9258 |
| 0.0017 | 40.11 | 3650 | 0.3825 | 0.9258 |
| 0.0017 | 40.66 | 3700 | 0.3852 | 0.9258 |
| 0.0017 | 41.21 | 3750 | 0.3854 | 0.9258 |
| 0.0017 | 41.76 | 3800 | 0.3823 | 0.9258 |
| 0.0016 | 42.31 | 3850 | 0.3829 | 0.9258 |
| 0.0016 | 42.86 | 3900 | 0.3873 | 0.9258 |
| 0.0016 | 43.41 | 3950 | 0.3842 | 0.9258 |
| 0.0016 | 43.96 | 4000 | 0.3857 | 0.9258 |
| 0.0016 | 44.51 | 4050 | 0.3873 | 0.9258 |
| 0.0016 | 45.05 | 4100 | 0.3878 | 0.9258 |
| 0.0016 | 45.6 | 4150 | 0.3881 | 0.9258 |
| 0.0016 | 46.15 | 4200 | 0.3888 | 0.9258 |
| 0.0016 | 46.7 | 4250 | 0.3891 | 0.9258 |
| 0.0016 | 47.25 | 4300 | 0.3878 | 0.9258 |
| 0.0016 | 47.8 | 4350 | 0.3890 | 0.9258 |
| 0.0016 | 48.35 | 4400 | 0.3890 | 0.9258 |
| 0.0015 | 48.9 | 4450 | 0.3895 | 0.9258 |
| 0.0015 | 49.45 | 4500 | 0.3896 | 0.9258 |
| 0.0015 | 50.0 | 4550 | 0.3894 | 0.9258 |
### Framework Versions
- Transformers 4.24.0
- Pytorch 1.12.1+cu113
- Datasets 2.7.0
- Tokenizers 0.13.2
## 📄 License
This project is licensed under the Apache-2.0 license.