# 🚀 sign_language_classification_v1
This model is a fine-tuned version of google/vit-base-patch16-224-in21k on an unknown dataset. It can be used to classify sign-language images, achieving an accuracy of 0.8056 on the evaluation set.
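For inference, the model can be loaded with the Transformers `pipeline` API. The snippet below is a minimal sketch: the model identifier and image path are placeholders to be replaced with the actual Hub repo id (or a local checkpoint directory) and a real input image.

```python
from transformers import pipeline

# Placeholder model id: substitute the actual Hub repo id or a local checkpoint path.
classifier = pipeline("image-classification", model="sign_language_classification_v1")

# "sign.jpg" is a placeholder path to an image of a hand sign.
for prediction in classifier("sign.jpg"):
    print(f"{prediction['label']}: {prediction['score']:.4f}")
```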
## 📚 Documentation
### Model Information

| Property | Details |
|----------|---------|
| Model Type | Fine-tuned version of google/vit-base-patch16-224-in21k |
| Metrics | Accuracy |
| License | Apache-2.0 |
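Since the training script is not published, the following is only a sketch of how the base checkpoint would typically be loaded for fine-tuning; `NUM_CLASSES` is an assumed placeholder for the number of sign-language classes in the dataset.

```python
from transformers import ViTForImageClassification, ViTImageProcessor

NUM_CLASSES = 10  # assumption: set to the number of sign classes in your dataset

# Load the image processor and the base checkpoint this model was fine-tuned
# from, replacing the pretraining head with a fresh classification head.
processor = ViTImageProcessor.from_pretrained("google/vit-base-patch16-224-in21k")
model = ViTForImageClassification.from_pretrained(
    "google/vit-base-patch16-224-in21k",
    num_labels=NUM_CLASSES,
)
```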
### Evaluation Results
This model achieves the following results on the evaluation set (a sketch of the metric computation follows the list):
- Loss: 1.3445
- Accuracy: 0.8056
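The card does not say how accuracy was computed. A common setup, shown below as an assumed sketch rather than the author's confirmed method, is a `compute_metrics` hook built on the `evaluate` library and passed to the `Trainer`.

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # The Trainer passes (logits, labels); accuracy is the fraction of
    # argmax predictions that match the reference labels.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```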
### Training Procedure

#### Training Hyperparameters
The following hyperparameters were used during training (mapped onto `TrainingArguments` in the sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
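Expressed as Hugging Face `TrainingArguments`, the configuration above would look roughly like the sketch below; `output_dir` is a placeholder, and any option not listed in the card is left at its default.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="sign_language_classification_v1",  # placeholder output directory
    learning_rate=5e-05,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,      # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
```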
#### Training Results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 3.2889 | 1.0 | 8 | 3.2714 | 0.0556 |
| 3.2492 | 2.0 | 16 | 3.2615 | 0.125 |
| 3.2263 | 3.0 | 24 | 3.2034 | 0.125 |
| 3.1271 | 4.0 | 32 | 3.1297 | 0.2083 |
| 2.9592 | 5.0 | 40 | 3.0655 | 0.2639 |
| 2.9414 | 6.0 | 48 | 2.9282 | 0.3472 |
| 2.7337 | 7.0 | 56 | 2.8254 | 0.4028 |
| 2.6683 | 8.0 | 64 | 2.6909 | 0.4583 |
| 2.5837 | 9.0 | 72 | 2.5904 | 0.5417 |
| 2.4566 | 10.0 | 80 | 2.5380 | 0.5833 |
| 2.2188 | 11.0 | 88 | 2.4682 | 0.5417 |
| 2.2885 | 12.0 | 96 | 2.3196 | 0.5833 |
| 2.005 | 13.0 | 104 | 2.2824 | 0.6667 |
| 1.9293 | 14.0 | 112 | 2.1967 | 0.6389 |
| 1.8396 | 15.0 | 120 | 2.0287 | 0.7361 |
| 1.7066 | 16.0 | 128 | 2.0357 | 0.7361 |
| 1.6911 | 17.0 | 136 | 1.9670 | 0.7361 |
| 1.6285 | 18.0 | 144 | 1.9186 | 0.7361 |
| 1.6064 | 19.0 | 152 | 1.9239 | 0.6944 |
| 1.6067 | 20.0 | 160 | 1.7723 | 0.7778 |
| 1.4094 | 21.0 | 168 | 1.7701 | 0.75 |
| 1.4664 | 22.0 | 176 | 1.7453 | 0.75 |
| 1.3255 | 23.0 | 184 | 1.7103 | 0.7083 |
| 1.3253 | 24.0 | 192 | 1.7216 | 0.7778 |
| 1.2416 | 25.0 | 200 | 1.5770 | 0.7778 |
| 1.1696 | 26.0 | 208 | 1.5099 | 0.7917 |
| 1.1645 | 27.0 | 216 | 1.4630 | 0.7917 |
| 1.0646 | 28.0 | 224 | 1.4989 | 0.7917 |
| 1.0149 | 29.0 | 232 | 1.5569 | 0.7222 |
| 1.0799 | 30.0 | 240 | 1.3602 | 0.8333 |
| 0.9528 | 31.0 | 248 | 1.3782 | 0.8472 |
| 1.0461 | 32.0 | 256 | 1.3698 | 0.8333 |
| 0.9019 | 33.0 | 264 | 1.3251 | 0.8611 |
| 0.9494 | 34.0 | 272 | 1.3586 | 0.8472 |
| 0.9439 | 35.0 | 280 | 1.3526 | 0.8333 |
| 0.9089 | 36.0 | 288 | 1.2728 | 0.8333 |
| 0.8962 | 37.0 | 296 | 1.3006 | 0.7917 |
| 0.9482 | 38.0 | 304 | 1.2592 | 0.8611 |
| 0.8804 | 39.0 | 312 | 1.3527 | 0.7778 |
| 0.8348 | 40.0 | 320 | 1.2759 | 0.8056 |
| 0.7823 | 41.0 | 328 | 1.3071 | 0.8194 |
| 0.8944 | 42.0 | 336 | 1.2428 | 0.8194 |
| 0.9677 | 43.0 | 344 | 1.2903 | 0.7778 |
| 0.9584 | 44.0 | 352 | 1.3119 | 0.7639 |
| 0.8342 | 45.0 | 360 | 1.3502 | 0.7778 |
| 0.7878 | 46.0 | 368 | 1.1941 | 0.8333 |
| 0.7817 | 47.0 | 376 | 1.2670 | 0.8056 |
| 0.812 | 48.0 | 384 | 1.2068 | 0.8194 |
| 0.9714 | 49.0 | 392 | 1.3480 | 0.75 |
| 0.9362 | 50.0 | 400 | 1.4028 | 0.7083 |
### Framework Versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
## 📄 License

This project is licensed under the Apache-2.0 license.